jones177

Member

Everything posted by jones177

  1. Could be, but it could have a cheap processor in it. My LG B9 has an older, slower, less expensive processor than the C9, which costs $200 more. They both share the same screen and I could not see a difference in image quality. If I were using it as a traditional TV I would have bought the C9, since it has quicker smart menus, but since it is hooked up to a computer I only use the remote to turn it off and on and press the Netflix button. Manufacturers may be keeping their less expensive TVs slow so their expensive TVs feel better.
  2. Years ago I bought a Samsung TV with fake 120hz for one of the bedrooms. It was a disaster as a monitor and as a TV. It is still in a bedroom, but I don't think it has been used for years. I watched TV for years in my bedroom on a 4k monitor while my big TV sat idle on a stand in front of a sofa. Now I have thrown out the TV, stand and sofa and replaced them with two black 6ft x 30" tables with gaming chairs. One of the tables has a 55" G-sync compatible OLED TV on it, and I am using my living room again. I still have my 4k monitor setup in the bedroom, but I will be replacing it with a small OLED sometime this year, along with the unused Samsung. It took G-sync compatibility to get TVs back in favor with me. Now the days are numbered for my monitors.
  3. Looks perfect. I use an LG B9. LTT did a video on the B9 and it is G-sync compatible as well. That is why I bought it. What amazes me is the image quality at 1440p. It is better than my 32" LG 1440p 144hz monitor. I use that resolution with Metro Exodus with RT on. I am not worried about burn-in at all. If I leave it for a few minutes it switches itself off, and if I stop typing the screen dims slightly and brightens up when I resume. My setup is not pretty but it is functional.
  4. I use a G-sync compatible TV with one 2080 ti and a Freesync monitor with the other. They just work.
  5. Here is my Fire Strike score with an EVGA 1080 ti SC2. The physics score will be higher than yours since mine comes from a 5ghz 6 core CPU, but the graphics score should be about the same. You can download Fire Strike along with Time Spy for free on Steam. Click on the "Download Demo" button on the Steam page.
  6. I get poor performance with my high end desktop in that game. I use Hardware Unboxed settings to get a good image and playable frame rates.
  7. Yes. My stock i7 8700k did it, but my i7 8086k at 5ghz doesn't, except when I overclock the 2080 ti FTW3 Ultra, and that bottleneck is at 1080p. So my EVGA XC does not bottleneck but my FTW3 Ultra does. They both use the same CPU, motherboard and ram. The difference in the cards is that the FTW3 Ultra runs between 2040mhz and 2070mhz with a 24/7 overclock and the XC runs between 1890mhz and 1985mhz.

     To test your rig, use GPU-Z. Under Advanced, select Nvidia BIOS. Under "Power Limit" you will see the maximum watts used. For my XC it is 338 watts and for my FTW3 Ultra it is 373 watts. Yours will be more or less depending on the model. The max limit applies if you have the power limit turned all the way up on your card; if not, use the "Default" number. Now click on Sensors. Double click on "Power Consumption W" or use the down arrow to set it to "Show Highest Reading", then play your games. If you are not close to the power limit set on the card, you are probably bottlenecking.

     Another way is to use MSI Afterburner and look at the "Power %" graph after a gaming session. If you set your card's power limit to 100% and you are only using 77%, you are bottlenecked. Basically, your GPU will use less power if it is waiting for the CPU. Some games don't use all of the GPU, but that is rare nowadays. If you would rather script the check, see the sketch after this post.

     The problem with Task Manager is that if you set the power limit on your card to 100% and run a bench, it will say 100%. Then if you change the card to run at 130% it will still say 100%, and if you bring the power limit back to 100% it will still say 100%. Bottlenecks on 2080 tis usually happen when the card has a maxed out power limit and you are only getting about 109% out of 130%, so they won't show up at all in Task Manager.

     I just ran some tests, and in some scenarios it went the way I said, but in others it did not. If I change settings while Task Manager is open it won't reflect them, but if I switch it off, change a setting and switch it back on, it shows an accurate reading. That confirmed I will not be using Task Manager for GPU usage, but if you are not in an overclocking session it can give an accurate reading.
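     Here is a rough way to script that power check with nvidia-smi instead of watching GPU-Z. This is just a sketch in Python, assuming nvidia-smi is on your PATH; the 90% cutoff is my own rule of thumb, not a number from any tool.

         import subprocess

         def gpu_power_utilization():
             # Ask the driver for the current draw and active power limit (watts) on GPU 0.
             out = subprocess.check_output(
                 ["nvidia-smi", "-i", "0",
                  "--query-gpu=power.draw,power.limit",
                  "--format=csv,noheader,nounits"],
                 text=True,
             )
             draw, limit = (float(x) for x in out.strip().split(", "))
             return draw, limit, 100.0 * draw / limit

         draw, limit, pct = gpu_power_utilization()
         print(f"Drawing {draw:.0f} W of a {limit:.0f} W limit ({pct:.0f}%)")
         # A card with its power limit maxed that sits well under that limit
         # while gaming is waiting on the CPU, i.e. you are bottlenecked.
         if pct < 90:
             print("Well under the power limit -> probably a CPU bottleneck")

     Run it while the game is going. It is the same idea as "Show Highest Reading", just a single snapshot instead of a peak.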
  8. I buy high end cards and always watch the prices. I bought a GTX 980 ti when they came out for $695, but after the 10 series came out I got one for $454. With the GTX 1080 ti I paid $760 for the first one and $657 for the one I bought when the RTX cards came out. The mining shortage screwed up the prices on that one, but it was still $100 less. If for some reason the new cards coming out this year don't have HDMI 2.1, I will buy a third 2080 ti if I can get an EVGA XC for $900. That is $300 off. I do expect the EVGA Black versions to go below $900, and they would be perfect for an R 3600.
  9. I use a NH-D15 with an i7 8086k at 5.1ghz(1.34v) and it idles around 33c. I also have an i9 9900k at 5ghz(1.27v) all cores using a NH-D15 that idles at 32c. I have a Dark Rock Pro 4 sitting in a box because it won't fit over my ram. I hope I remember to buy low profile ram so I can use it in the future.
  10. I think you should only buy an i9 9900k if you plan to run a 2080 ti or higher for gaming. I tested my GTX 1080 ti against the 2080 ti for quite a while and found that the 2080 ti needs 5ghz on an Intel CPU to reach its full potential. With the GTX 1080 ti it was not needed at all, and I expect an RTX 2080 or less would not need it either.

      A problem may start with next gen high end GPUs. My EVGA 2080 ti XC does not bottleneck at 1080p with a 5ghz CPU in the games/benches I tested, but my EVGA 2080 ti FTW3 Ultra does. I have to get the CPU to 5.2ghz to stop the bottlenecks, and that is more than my cooling system can do 24/7. The performance difference between my XC and FTW3 Ultra is not much, but comparing the two is like going over a cliff. With the next gen (3080 ti) I expect a bottleneck at 1440p at 5.1ghz. That is what my stock i7 8700k(4.7ghz) did with the FTW3 Ultra.

      I use 6 core CPUs and I would not buy another one. I have an i9 9900k (8 core) rig I test against my i7 8086ks (6 core), and from the tests it is not time to upgrade to 8 cores yet, but there will come a point when 6 core CPUs become unsmooth in open world games just like the 4 cores did.
  11. I have the same problem. For some unknown reason my EVGA 2080 ti XC will not release without constant pressure on the lever, and I can't do that while lifting the card. My EVGA FTW3 Ultra is easier to remove from the same motherboard and it is larger. To remove the card I have to lay the computer on its side as well. I destroyed 2 motherboards with screwdrivers, so I am paranoid. Both were ASUS P5-Ds that I bought in 2008. I bought four P5-Ds and only one survived without damage. It is the reason I switched to Noctua in the first place.
  12. Video Card: EVGA GeForce RTX 2080 Ti 11 GB Black Video Card ($1106.98 @ Newegg). It boosts to 1560mhz and has a 112% power limit.
  13. Don't get the Black. It is best to stay away from cards that have a boost clock lower than 1635mhz. The Blacks also have a power limit of 112%, while the XC and XC Ultra have a power limit of 130%. You need the higher power limit if you want a decent overclock or to go on water later, and it will also resell better. On a budget I would go with the XC Ultra. To see what those percentages mean in watts, see the quick math after this post.
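      To put the power limit percentages in watts, here is the quick math. I am assuming a 260 watt default TDP, since 260 x 1.30 matches the 338 watt max my XC reports in GPU-Z; the Black's default may not be exactly the same.

          # Rough watt ceilings implied by the power limit percentages.
          # The 260 W default is inferred from my XC (338 W max / 1.30);
          # I am assuming the Black shares it, which may not be exact.
          DEFAULT_TDP_W = 260

          for card, limit_pct in (("Black", 112), ("XC / XC Ultra", 130)):
              max_watts = DEFAULT_TDP_W * limit_pct / 100
              print(f"{card}: {limit_pct}% power limit -> about {max_watts:.0f} W")

      That works out to roughly 291 watts for the Black versus 338 watts for the XC, which is the headroom you give up.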
  14. It is the main reason I have a 16:9 gaming rig and a 21:9 gaming rig. One of my all time favorite games is Skyrim (2011 version) and it does not like 21:9 at all. It is strange, since Oblivion, Fallout 3 and NV do it a lot better, though not perfectly. For Fallout 4 I use mods to do 21:9. Sometimes the program Flawless Widescreen fixes the issues, but I rarely use it now. https://www.flawlesswidescreen.org/
  15. My GPU overclock is good for about 10 frames and a little more heat. It is not on by default like my CPU overclock, so I set it before I play a game. All my GPU gaming overclocks use the default fan curve to keep noise down, but I have benching overclocks that run the fans at 100%. They are really noisy. If I am playing at 4k or RT games I usually notice if I forget to turn on the overclock, but at lower resolutions and without RT I may not notice at all.
  16. Here is the 24/7 overclock on my XC. It uses the stock fan curve, since it is too noisy for me with more. I only use it for RT games, AC:O and RDR 2. The rest of the time it is 0 on the clock. The max overclock on the XC is not the highest it can go; it is the highest it can go without downclocking to the point that it actually runs slower. That happens at 84c. If the card were water cooled like the one in the video it would run about the same.

      Heaven is kind of strange. I use Heaven to simulate single CPU core games at low resolution. I can't use the 1080p setting to compare the XC with the FTW3 Ultra, since the FTW3 Ultra bottlenecks on it even with the i7 8086k at 5.1ghz. It is sort of like the edge of a cliff: the XC is on the edge and the FTW3 Ultra goes right over.

      Here is the XC running RDR 2 at 1440p and 4k using Hardware Unboxed settings, Shadow of the Tomb Raider at 1080p on the highest preset, and AC:O at 1080p and 1440p on the ultra preset. This is with the 24/7 overclock. The FTW3 Ultra benches at 21:9 since it uses an ultrawide monitor.
  17. The power limit on your card should allow it to use about 350 watts. My XC uses about 347 watts with its maximum overclock. Here is what it looks like. To run at 2150mhz the temperature needs to be in the 30s. That means water with a dedicated 240mm rad. So K|NGP|N.

      With the overclock I use and all fans at max, my FTW3 Ultra runs at 2100mhz before it reaches 54c. It downclocks to 2085mhz after 54c and can maintain that until it reaches 62c, then it clocks down to 2070mhz. At 67c it downclocks again to 2055mhz. I run the card with the default fan curve and it maintains 2055mhz at around 72c, but at 74c it downclocks to 2040mhz, and that happens running games that use the CPU as well as the GPU (AC:O). It never goes below 2040mhz. The steps are sketched out after this post.

      The FTW3 Ultra can use about 40 watts more than the XC but is limited by its cooling as well. My 2 slot XC runs a bit hotter. With my 24/7 overclock and the default fan curve it runs at 1935mhz at 81c. Its cooling solution is inadequate (too noisy) for overclocking 24/7, so I usually run it at stock clock with 800 on the memory and 130% on the power limit, and that is about 1875mhz at 79c.

      A 2080 ti's performance depends on its cooling solution. My XC can overclock higher on the GPU and memory but can never outperform my FTW3 Ultra because of its cooling solution.
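      Those FTW3 Ultra temperature steps in code form, in case the pattern is easier to see that way. The breakpoints are just what I observed on my card, not anything from a spec sheet, and your card's steps will differ.

          # Observed temperature -> boost clock steps on my FTW3 Ultra.
          # My own measurements, not an official spec.
          STEPS = [(54, 2100), (62, 2085), (67, 2070), (74, 2055)]
          FLOOR_MHZ = 2040  # it never went below this in my testing

          def expected_clock(temp_c):
              # Return the clock the card settles at below each temperature breakpoint.
              for below_c, mhz in STEPS:
                  if temp_c < below_c:
                      return mhz
              return FLOOR_MHZ

          for t in (50, 60, 72, 80):
              print(f"{t}c -> about {expected_clock(t)}mhz")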
  18. My LG B9 does 120hz at 1440p and that is doable now. My 2080 tis don't come close to 120hz at 4k, so I can wait for next gen.
  19. Have you tried running the card in debug mode? Some factory overclocks don't seem to do well with age. I know my Gigabyte GTX 1080 Xtreme had more issues as it got older and now sits in a box. The EVGA 1080 SC card I bought at the same time still runs stock, but there is zero overclocking headroom on it now.
  20. My EVGA 2080 ti XC is a dual slot. I bought it so it can be card #1 if I decide to SLI. When shopping for a 2 slot card look for ones that are set to 1635 mhz or higher on the boost clock and have from 124 to 130% on the power limit. Avoid anything with less. Here is how my XC runs in Superposition stock and overclocked.
  21. I had my 2 gaming PCs in my bedroom for about 2 years. With both on and one gaming, the temperature in the room would rise 3 to 5c depending on the game. Now that they are back in the living room they have no measurable effect on the temperature at all. The difference is that there is next to no airflow in the bedroom, while the living room has lots of air movement. What benefits the most is my EVGA 2080 ti XC. It was running in the low 80s in benches in the bedroom but is now in the high 70s in the living room. I do miss the computers on cold nights, and the stock i7 6700k with a GTX 1080 that I have in the bedroom now is no match in heat generation for the two 6 cores at 5ghz and the two 2080 tis.
  22. I have about 6500 hours on Fallout 4. It is my fourth most modded game, with 165 mods installed. I don't play the main quest or the DLCs but build and protect my settlements. I have settlers set to killable and NPC spawns at 5X, so for every 1 enemy spawned vanilla, 5 are produced. This means enough radroaches to bring down a brahmin and swarms of bloodbugs large enough to bring anything down.

      This all comes at a cost, since character saves over level 100 have to go on my computer that uses the 970 EVO SSD (they stutter on a SATA SSD), and it is the only game that actually needs the 5ghz on the CPU to keep frames over 60 at 4k. When the game came out I could run it at 60fps 4k with a GTX 980 ti and an i7 2600k. Now it takes an RTX 2080 ti with an i7 8086k, both overclocked, and I have to limit the spawns to stay stutter free. I prefer 10X spawns, but after level 30 the hordes of enemies create a moving framerate drop as they go down a road.

      What I like to do most in Fallout 4 is revisit the saves I started when the game came out. These saves are at level 140 plus and are now waiting for new hardware to make them playable again. Now I spend most of my time playing Space Engineers. I only have about 250 hours so far, but I can easily see getting into the thousands with it.
  23. I did photo editing for print for years, and Photoshop tends to hide things like compression artifacts if they are not viewed at 100%. The last monitor I bought for it was a 32" VA 4k LG monitor. It is 10 bit (8 bit + FRC) and cost only $350. Since most output went to print I did not need a super color accurate monitor. Most images had elements that had to be Pantone matched in CMYK, so spotting color shifts in the conversion was more important. Before the 32" LG I used a 28" Samsung 4k TN monitor that I bought in 2015. I did not like the size at all, but it was a lot better at the job than the 27" 1080p monitors I used for years. Now my monitor of choice is my 55" OLED TV. I am retired now, so I don't use it for work, but if I did start again it is what I would prefer.
  24. When Fallout 4 came out I already had a 4k monitor and could not understand why people thought it looked like crap. I then played it at 1080p and understood why. It is sort of like the developers all used 4k monitors and never looked at their work at a lower resolution. Take trees, for example. There are not enough pixels at 1080p for their tree designs to work, so they look like an anti aliasing nightmare. Even at 1440p they don't look great. The game does work at 1600p, but I still prefer to play it at 4k.
  25. I recommend using the EVGA forum. They will let you know if it is RMA time.