Everything posted by jones177

  1. I have a 3090 Ti and found that it can exceed the bandwidth of some monitor cables, which makes the screen go black. I had to replace a cable that had worked fine with a 3080 Ti and a 3090. To test whether this is your problem, lower your panel's refresh rate or the resolution; if the black screens stop, the cable is the bottleneck. A rough way to estimate the bandwidth a mode needs is sketched below.
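
A rough way to check whether a cable standard can carry a given mode is to compare the mode's data rate with the link's payload rate. A minimal sketch in Python, assuming uncompressed 10-bit colour and roughly 20% blanking overhead; the link figures are approximate payload rates, not official spec numbers, and DSC compression changes the picture entirely:

```python
# Rough check of whether a display link can carry a given mode.
# Payload rates are approximate: real links lose capacity to encoding
# (8b/10b on DP 1.4, 16b/18b on HDMI 2.1), and blanking intervals add
# roughly 20% on top of the visible pixels.

LINK_GBPS = {          # approximate usable payload, Gbit/s
    "HDMI 2.0": 14.4,
    "DP 1.4":   25.9,
    "HDMI 2.1": 42.7,
}

def mode_gbps(width, height, hz, bits_per_pixel=30, blanking=1.2):
    """Approximate uncompressed data rate for a mode, in Gbit/s."""
    return width * height * hz * bits_per_pixel * blanking / 1e9

needed = mode_gbps(3840, 2160, 120)   # 4K at 120 Hz, 10-bit colour
for link, cap in LINK_GBPS.items():
    verdict = "OK" if cap >= needed else "too slow"
    print(f"{link}: needs {needed:.1f} Gbps, has {cap} -> {verdict}")
```

Dropping the refresh rate from 120 Hz to 60 Hz in the call above halves the required rate, which is exactly why lowering the refresh rate makes the black screens stop when the cable is the limit.
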
  2. 3080 Tis are hard to cool. I ended up replacing all my gaming cases. Now I have cases with the intake below the GPU and all the rest of the fans as exhaust. With one intake fan it ran Heaven Benchmark at 66°C; it now has two and runs the bench at 62°C. The intake fan is a Noctua NF-A12x25. On my standard cases that have the PSU on the bottom I use Noctua 60mm fans, which stops hot air getting trapped below the GPU. This is my 5900X/3080 Ti setup. It will get a third intake fan at some point, since the old Lian Li O11 Dynamic runs a bit warmer than the new EVO version.
  3. It read like a mini nightmare for a while. I am happy it turned out well for you.
  4. They are not, and 2x 4K monitors is ideal. The only ultrawides I consider usable are the 38" 3840 x 1600 IPS versions. I am using one now. I have a 3440 x 1440 VA monitor that is not in use since it has bad text. My VA 1440p 144Hz monitor also had bad text. I like ASUS monitors for productivity since they have versions that pivot in the TUF and ProArt lines. If I was buying now it would be the ASUS ProArt PA329CV. For pro image editing I would use the ASUS ProArt PA329C; it is 10-bit, not 8-bit + FRC. An ASUS TUF Gaming VG289Q 28" monitor rotated 90° is a great second monitor.
  5. I would send this image off to Gigabyte and start an RMA. There are lots of videos on the ROG Crosshair VIII Formula BIOS. 1:04 is where XMP is.
  6. I have both and it depends on the version. For example, my EVGA XC3 Ultra is about 6 to 10 frames down on my EVGA FTW3 Ultra 3080 Ti at 4K. I compare at 4K since it is all GPU 90% of the time. Here is the Horizon Zero Dawn bench at 4K ultra:

      EVGA XC3 Ultra 3080 Ti = 83 fps
      EVGA FTW3 Ultra 3080 Ti = 89 fps
      MSI Gaming X Trio 3080 Ti = 89 fps
      MSI Gaming X Trio 3090 = 89 fps
      EVGA FTW3 Ultra 3090 Ti = 98 fps

      So my MSI Gaming X Trio 3090 performs about the same as a high-end 3080 Ti. A Suprim, Strix or FTW3 3090 will do better than the Gaming X Trio, but not by much.
  7. There is something definitely wrong with your setup. Use GPU-Z to check the power connectors. I run Heaven in a window so I can see how much power the card is pulling from the PCIe slot and the 8-pin connectors (see the logging sketch below). Also, do you have another computer to test the GPU in? If not, you need to have it tested. Do you have another GPU to test the system with? That would rule out the motherboard. Your CPU score is also a bit low, as if XMP was not enabled.
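
GPU-Z shows the per-connector numbers in its sensors tab. If you would rather have a log while the bench runs, nvidia-smi (installed with the NVIDIA driver) can capture the totals; a minimal sketch, noting that it reports whole-board power only, not the per-connector split:

```python
# Log total board power, GPU temperature and core clock once a second
# while a benchmark runs in a window. Stop with Ctrl+C.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=power.draw,temperature.gpu,clocks.gr",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```

A card that sits far below its rated wattage under full load, or clocks that bounce around, points at a power delivery or thermal problem worth chasing.
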
  8. I bought one for a computer programming class back in the 80s. Later I bought a Leading Edge 386 computer that had a hard drive, but I did not build a PC until the 486 came out, and that was to play Doom. 1993, I think. I don't miss them at all, since my main computer at the time was a Commodore 64 and later an Amiga 500.
  9. My 1980s 286 screen. It had the green text.
  10. The best way to test a card is to run a bench. I have a 5900X with a 3080 Ti and this is how it does in Time Spy with everything stock: https://www.3dmark.com/3dm/76018750? Your CPU score would be about the same and your GPU a bit lower. This is the sort of information you will need if you decide to start an RMA, since it shows your clock speeds and temperatures.
  11. (in the LG C1 thread) The B9 I use just for gaming is still fine. It is 3 years old now.
  12. Win 10. I am not going to 11 until I do a new build. So next year.
  13. (in the LG C1 thread) All my money at that time of year will go to 40-series GPUs, so I bought the 55" version this week for $100 more than the 48" version. They may also be out of stock by Black Friday, and the C2s will be at a discount. Word of warning: they are not monitors, and they will burn in if used like a monitor. For TV and gaming they are great, but with apps like Chrome they will burn in. The Windows start icon and lower taskbar will burn in as well, so have them set so that they disappear. Better than the TV's anti-burn-in nannies are the Windows built-in screensaver and dark mode everywhere.
  14. It may be the type of work you do. If I was still working I would use two 32" 4K monitors, since I used two computers at the same time: one for editing and one for rendering. Going past 32" made the desk too crowded. My 3440 x 1440 and 3840 x 1600 ultrawide monitors were failed experiments for work, but they did great for gaming. The 55" OLEDs were originally bought to replace 1080p TVs I had in the bedrooms that were not used with computers. I put one on the desk for fun and it has stayed.
  15. Terrible for you. This is my setup and it works for me: about 30" away. For work I used 32" 4K monitors and a 3840 x 1600 LG ultrawide; before that, 28" 4K monitors and a 3440 x 1440 ultrawide. I am retired now so I no longer use them, but my old i9 9900K computer will get the 38" ultrawide this weekend, since my Nano 85 is replacing it in a studio setup that my son uses.
  16. I have an i9 11900K; it used a 3080 Ti for a while but is now running a 3090 Ti. It was built to run Windows 10 and be air cooled. I also had an i7 6700K on a Hero VIII board that used a GTX 1080. That was its limit, so it was replaced with an i7 8086K to run a GTX 1080 Ti.

      The i9 will run better with more cooling, so I recommend a Noctua NH-D15S. The reason is TVB (Thermal Velocity Boost): to use TVB the CPU has to stay below 70°C. An i9 11900K with an overclock needs more cooling still, so a 360mm AIO is needed. Using TVB in some games and apps is better than a 5.1GHz overclock; the CPU can hit 5.3GHz on some cores.

      I would go with a 1000 watt PSU (see the rough sizing sketch below). My i9 10900K/3090 computer is on a meter and it has read 789 watts. When gaming it is usually at 550 watts, so I do not know how that reading occurred, but I am happy to have the 1000 watts. Also, if you upgrade in the future you may need it.

      Cooling a 3080 Ti is a pain. My Corsair 5000D and Cooler Master H500P Mesh failed at it, so I am using Lian Li PC-O11 Dynamics with Noctua NF-A12x25 chromax fans as intakes. They pull in air below the GPU, and the rest of the fans are used as exhaust. This also depends on which 3080 Ti you get. I have three and there is a 10-frame difference between the fastest and the slowest. The coolest is my MSI Gaming X Trio (350 watts) and the hottest is the EVGA FTW3 Ultra (400 watts). The cases that failed were on the 400 watt card.
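
A rough way to sanity-check the 1000 watt recommendation is to sum worst-case component draws and add headroom for transients. The wattages below are illustrative assumptions, not measurements from the build above:

```python
# Rough PSU sizing: sum worst-case component draws, add ~30% headroom
# for transient spikes. All wattages are illustrative assumptions.
parts = {
    "CPU (i9 11900K, boosting)":  250,
    "GPU (3090 Ti, power limit)": 450,
    "Board, RAM, SSDs, fans":      75,
}
total = sum(parts.values())
recommended = total * 1.3   # headroom for transient spikes
print(f"Worst case ~{total} W -> recommend a {recommended:.0f} W+ PSU")
```

With those guesses the total lands around 775 W and the recommendation just over 1000 W, which lines up with the 789 watt meter reading mentioned above.
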
  17. I think it is down to user preference. I started using OLEDs in 2019, so only 55" was available. The next year I bought a 49" LG Nano 85, and it sits a few inches closer to me than the 55" OLEDs. Two days ago I ordered a 55" C1 since that is the size I know I am comfortable with. The 48" C1 is $100 less and the 42" C2 is only about $150 more, so it was not about the price. In my experience none are a true replacement for a monitor, since they eventually do burn in. That is why I bought the Nano 85: it is IPS, so no burn-in. Treating any OLED like a monitor will kill it eventually; it is only a matter of time.
  18. Base clocks for Intel have more to do with warranty than anything else. My i9 9900K is using the default setup with XMP enabled on an Aorus Ultra Z390, so it is Multi Core Enhancement on auto. It was built in February of 2019 and got a 5GHz overclock on day one, at 1.272v. I had the CPU overclocked when it used 2x 1080 Tis. It lost its overclock with the RTX 2080 Ti since it was no longer needed; the overclock is only really needed at 1080p, and it has only used 1440p and 4K monitors since it got the 2080 Ti. The important thing is keeping it cool. Mine has used a Noctua NH-D15 with a single fan since new, and it usually games in the 60s (°C).
  19. Right now I have my motherboard overclocking (MCE) my i9 9900K; yours looks closer to stock. For two years mine was overclocked to 5GHz all cores, but this setup does about the same frames in games. It has used 2x GTX 1080 Tis in SLI, an RTX 2080 Ti, an RTX 3080, and now an RTX 3080 Ti. I would say the best match is the 3080, so you have lots of room to grow with it.
  20. It took about 4,900 hours for it to be noticeable on my B9, and it was the Chrome header that burnt in. After that I used it only for gaming and it did not get worse. I am replacing it with a C1 this week; it will go back to doing what caused the burn-in in the first place, and it will be replaced next year with a C2. Time will tell how much longer these newer OLEDs will last, but I would be happy with 5 years.
  21. My old B9s are bright enough for me, and with HDR they can hurt unless Windows is in dark mode. So my new OLED is a C1 for $1,100. I will get the newer tech when it is about the same price.
  22. I did 4K with an RTX 2080 Ti, which does about the same frames as a 3070 Ti. It was fine in most games, but in others I had to drop the resolution; if the game had DLSS, I would not have to do that. The real problem with the 3070 Ti at 4K is the amount of VRAM. Even a 3080 cuts it close in some games. With a 3070 Ti I would go with a 3840 x 1600 ultrawide at most, and a 3440 x 1440 to stay comfortable for a while. Another option is going OLED and playing at 1440p. I did that for a year with my 2080 Tis to get 1440p 120Hz, and it was great.

      For all frames over 60 in everything (so far), it takes a 3090 Ti. It is the first true no-compromise 4K card, since it does 10 extra frames at 4K over the 3080 Ti/3090. So is it worth it? Yes and no. If you mod old games with high-res textures it can be fantastic; I started using a 4K monitor with Skyrim on a 980 Ti. So yes. If you play space games or high-detail building games it is a must, and for open-world games that don't need a lot of frames, like Horizon Zero Dawn, it is great as well. But in a lot of modern games like RDR2 there is no extra information in the textures to be revealed at 4K over 1440p, so it is a waste in those titles; they look better on a 1440p ultrawide. So no. I ran side-by-side tests at 4K and 1440p to decide which I wanted to play at, and the difference in frame rate is large: 4K pushes 2.25x the pixels of 1440p, so GPU-bound titles pay for it directly.
  23. Sounds like staying-at-home (cave) syndrome to me. I was freelance and worked from home for years. If I did not have a project, or my computers were rendering, I would just pass out. Playing shooters helped a lot, but sometimes that was not enough, so it was Red Bull, Coke and coffee. When a company wanted an exclusive I agreed, but only if I could have office space (and a pension). Then I had a commute and was surrounded by people, so no passing out; I was made fun of if I did. Now that I am retired I am usually active in the morning, nap in the afternoon, then stay up until about 1 or 2 am. To me that is retired life.
  24. After I do the things mentioned above: I use HWiNFO64 to check that the cooling solution is working properly. I then run benches and take screenshots. I check online to see if the system is underperforming. I usually do this with Time Spy and Cinebench. The screenshots are used to spot changes in the system later on, or posted here to help others; a minimal score log along those lines is sketched below.
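
As a sketch of that record-keeping: append each run to a CSV, then flag any run that lands noticeably below the best recorded score. The file name, bench label and example score are all hypothetical:

```python
# Append benchmark runs to a CSV and flag runs >5% below the best.
import csv
import datetime
import pathlib

LOG = pathlib.Path("bench_log.csv")   # hypothetical log file

def record(bench, score):
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "bench", "score"])
        writer.writerow([datetime.date.today().isoformat(), bench, score])

def check(bench, score):
    with LOG.open() as f:
        rows = [r for r in csv.DictReader(f) if r["bench"] == bench]
    best = max(float(r["score"]) for r in rows)
    if score < 0.95 * best:
        print(f"{bench}: {score} is >5% below best ({best:.0f}), investigate")

record("Time Spy GPU", 17650)   # hypothetical score
check("Time Spy GPU", 17650)
```
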
  25. It was an issue for me in the past. Over the years it became less of an issue, and now it is nonexistent. It was the ghost in the machine. It happened doing things like large 3D projects and modding some games. The game it affected the most was the original 32-bit Skyrim (2011). Modded to the max, it would crash the whole system to the point that, to recover, I had to switch off at the wall and count to 30 before switching on again. In-game errors could survive a reboot, so it had to be done. These were the specs of the system that had the issue: i7 2600K, Gigabyte P67A-UD3P, G.Skill Ripjaws DDR3 2x4GB RAM. The GPUs I used while it was an issue were the GTX 570, 670, 680, 970, 980 and 980 Ti. I have that version of Skyrim installed with all its original mods on my i9 10900KF computer, and it does not crash at all. Both 3D programs I used for work did it as well on projects that maxed out the system; I used 3D Studio/Max and Lightwave. It ended for me in 2016, when I upgraded my gaming computers from 8GB of RAM to 16GB and my work computers to 32GB.