jones177

Everything posted by jones177

  1. Your graphics score is less than half of what my slow (EVGA XC) 2080 ti does (16040), so something is up. A water-cooled card should do about 16250. Run MSI Afterburner and see if the card is reaching its power limit. At stock that would be 100%. Also make sure that it is not stuck at 1350MHz; that is a known bug. Your CPU score looks fine. My 5GHz 6 cores do around 8000.
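The Afterburner check above can also be done from the command line. This is a rough sketch that parses one line of `nvidia-smi --query-gpu=clocks.sm,power.draw,power.limit --format=csv,noheader` output and flags the two failure modes mentioned (stuck at 1350MHz, or pinned at the power limit). The thresholds come from the post; the sample reading is hypothetical.

```python
# Parse one CSV line from nvidia-smi and diagnose the card's state.
def parse_gpu_status(csv_line):
    clock_str, draw_str, limit_str = [f.strip() for f in csv_line.split(",")]
    clock_mhz = int(clock_str.split()[0])    # "1350 MHz"  -> 1350
    draw_w = float(draw_str.split()[0])      # "180.00 W"  -> 180.0
    limit_w = float(limit_str.split()[0])    # "250.00 W"  -> 250.0
    return clock_mhz, draw_w, limit_w

def diagnose(clock_mhz, draw_w, limit_w):
    if clock_mhz <= 1350:
        return "stuck at base clock"
    if draw_w >= 0.99 * limit_w:             # within 1% of the limit
        return "power limited"
    return "ok"
```

If `diagnose` reports "power limited" at stock, raising the power limit slider in Afterburner is the usual next step.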
  2. For high end I would not go with a Strix motherboard; I would go with an AORUS board. If it has to be ASUS, go Hero. A 970 EVO only benefits open-world games that load as you move around the world. Only my modded games that use 4k and 8k textures or 10x the NPCs use the extra speed of the EVO. For all non-modified games the SATA SSDs in my other gaming computer do just fine. That may change with games designed for the new consoles. For the high end get a 2tb 970 EVO. For games now, a 4tb SATA. Since I mod games I would only buy 2tb EVOs. The boost clock on the ASUS RTX 2080TI O11G Gaming is too low for me to consider it high end. A 2080 ti has to run at around 2040 to 2100MHz with a 24/7 overclock to be considered high end. My EVGA FTW3 Ultra can do that with the default fan curve but my EVGA XC can't come close. It also needs a power limit over 124% so it can get the power to maintain high clocks. A better choice would be an EVGA Hybrid since it has a higher boost clock (1755MHz) and would stay a lot cooler, and that equals higher MHz. It is also on sale at $200 off. I use 3200MHz ram but I am starting to think it was a mistake. If I was buying now I would go with 3600MHz ram, since if I decided to go Ryzen next gen I would have ram better suited to it.
  3. For that CPU the EVGA XC Ultra would be perfect.
  4. I used an AORUS Ultra and it works well with 5GHz on all cores. This gen I would only buy AORUS. Last gen I only bought Heroes.
  5. I replaced my GTX 1080 tis with RTX 2080 tis and I have been happy with the decision. They gave me the frames I wanted. That was in 2018. I would not buy one now unless it was a replacement. If there was a big discount I would say yes. If you still use the i7 6700k, that is another good reason not to get a 2080 ti. My i7 8700k at stock on all cores (4.7GHz) with an EVGA FTW3 Ultra could not beat my i7 8086k at 5GHz using an EVGA XC. The FTW3 Ultra cost $300 more than the XC and runs 10c cooler. With the 2080 tis the performance drop-off is like a cliff. It is not even comparable to the 1080 ti because of it (I have 3 1080 tis). I replaced the i7 8700k since I lost the silicon lottery with it. Only the 2080 tis that have a boost clock of 1635MHz or higher are worth buying. The ones with less usually only have 112% on the power limit and run too hot. A 2080 ti needs 124 to 130% on the power limit to be worth buying at all. That leaves out most of the cheap 2080 tis. The minimum is a card like my XC, but it does run hot overclocked, and had I known I would have gotten the Ultra version. Also, if you still mod Fallout 4, a 2080 ti will not help much with getting frames, but a 5GHz 6 or 8 core upgrade transforms the game. The lows in Boston and around built-up settlements are above 60fps, so they don't exist. The game is smooth everywhere even with 10 times the NPCs of vanilla (60 feral ghouls).
  6. I do 50% Amazon, 40% Newegg and 10% manufacturers. Amazon's product descriptions are poor and sometimes false, so I carefully research anything I buy there. The draw is that I am a Prime member, so fast, free shipping. The downside is that they have some lazy drivers.
  7. It depends on what your 2080 ti runs at. I tested between my two gaming computers. At the time they used the same motherboard, ram and SSD, so the only differences were the CPU frequency and the cooling solution of the 2080 tis. One is an i7 8700k at 4.7GHz on all cores with an EVGA FTW3 Ultra 2080 ti running at 2040MHz average overclocked. The other is an i7 8086k at 5GHz on all cores with an EVGA XC 2080 ti running at 1950MHz average overclocked. The computer with the 5GHz CPU won every test even though its 2080 ti was slower and cost $300 less. It made the FTW3 Ultra pointless. I only noticed the difference in performance because I have 2 gaming computers. When they were both running EVGA 1080 ti SC2s the difference between them was 3 to 5 frames in games and benches. The 1080 tis made the 5GHz CPU pointless. I replaced the i7 8700k with another i7 8086k and now the FTW3 Ultra always wins. At hybrid temps and with good silicon your 2080 ti should run between 2085MHz and 2115MHz overclocked. If you want all the performance get an i9 9900k and overclock it to 5GHz on all cores or more. If you don't mind losing a few frames get the R 3700x. The difference in gameplay does not exist in vanilla games. In building and heavily modded games it can mean the difference between smooth and stutter if taken to the limit.
  8. If you are using a 5930k then you will have a bottleneck problem. The 5930k has about the same IPC as an i7 4770k or an R3 2200G, and that is not suitable at all for a 2080 ti. Even my i7 8700k that ran perfectly with a GTX 1080 ti had issues with an RTX 2080 ti, but my i7 8086k, which is a 5GHz chip, did not. For picking a CPU for an RTX 2080 ti I use the Cinebench R15 single core test, and I consider any CPU with a score of less than 216 unsuitable for a 2080 ti. So for Ryzen, an R9 3900X or an R9 3950X. For Intel, an i9 9900k or better. I play building games like Space Engineers and modding games like Skyrim and Fallout 4, and high IPC and frequency are a must if you like to push it and stay smooth.
  9. I like it on my OLED in movies going through the TV. Through Windows it is just plain bad even on the OLED. It is literally night and day. In games it is about the same on my OLED TV and my HDR 10 monitor since they go through Windows.
  10. I bought an expensive adaptor and had the same issue. I did some tests and found out the reception was bad under the table where the computer was even with 4 aerials. I put the aerials on top of the table and now it is as fast as a cable.
  11. I would only go Intel if you were going for an i9 9900k along with an RTX 2080 ti. My 2080 tis like 5GHz but my GTX 1080 tis could not care less. Also, 6 cores with no hyperthreading is a dead end, so an i7-9700K is the minimum this gen. I would not get a 1tb 970 EVO unless you need it for productivity. I have the 2tb version in one of my computers. Only my heavily modded/building games use the extra speed. A 2tb SATA would be a better choice if you don't mod games. The only reason I am not using my 2011 build's PSU (750 watt) is because I wanted to go modular. Only my overclocked RTX 2080 ti uses a silly amount of power, so your old 650 should be fine.
  12. My modded Fallout 4 is more fun than any vanilla game I have played. Right now I am playing Space Engineers. It is fun if you like building things. I am not using mods yet but I will in the future. Today I am setting up a server for my son and his friends. I am hoping that my i7 6700k is up to the task since I have a motherboard for it, but if not I will use an i7 8700k with a new motherboard. I won't build on the server but I may take part in space battles as a missile sponge.
  13. I have both and can't tell the difference, but I play at 1440p 120hz, 1440p 144hz and 4k 60hz. The 144hz monitor is an LG 32" with a G-sync module and my LG OLED TV is G-sync compatible. Both are better than my Freesync monitor with an Nvidia GPU, but my GPU is capable of doing the frames at the refresh rate of the monitor, so I rarely have an issue.
  14. CPU usage is a bit meaningless on its own since some games use more CPU than others. What I do is play at different resolutions to take load off the GPU and put it on the CPU. Here is what it looks like in AC:O. At 4k the CPU is working less than it is at 1440p. There are no numbers in the 30s and 40s at 1440p. At 4k the GPU is working harder than at 1440p. Even though the usage percentage is about the same, it is running hotter and using more power. The GPU never gets close to 130% on the power limit at 1440p, but at 4k it stays there most of the time. With Control the readings are more in line with older games. The game is using one thread to do the rendering; that is thread 11. At 4k the CPU is working less, as usual. The GPU is running about the same with almost half the frames at 4k. AC:O uses more of the CPU, so the frame rate difference between 4k and 1440p is not much. With Control, not much of the CPU is used, so the frame rate is much higher at 1440p. Also note the temperature of the CPU in both games. With a game like Control your 4 core would be fine, but with games like AC:O you would be horribly bottlenecked. With my i7 2600k and i7 6700k at over 42% usage I was losing a serious amount of frames with 1080 tis, and that is why I went 6 core and 5GHz plus. The cores are for games like AC:O and the GHz is for games like Control.
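The resolution-swap method above can be reduced to a rule of thumb: if dropping from 4k to 1440p barely raises the frame rate, the CPU is the limit; if it raises it a lot, the GPU is. A minimal sketch, with an arbitrary 15% gain threshold of my own choosing (not from the post) and hypothetical fps numbers:

```python
def bottleneck_test(fps_1440p, fps_4k, threshold=0.15):
    """Classify the limiting component from average fps at two resolutions.
    The 15% gain threshold is an illustrative assumption."""
    gain = (fps_1440p - fps_4k) / fps_4k
    return "CPU-bound" if gain < threshold else "GPU-bound"

# Hypothetical numbers in the spirit of the post: an AC:O-like game barely
# gains at 1440p (CPU-bound); a Control-like game nearly doubles (GPU-bound).
```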
  15. Every stick did work. The 2 computers that took 4 sticks got them. The other 2 got 8gb sticks. The difference in 3 of the motherboards was the revision number. When I ran a business I bought lots of motherboards and ram. The ones that gave me trouble I remember, but the ones that didn't I don't remember at all.
  16. I sit down to play because I am old and play games like Skyrim for hours at a time. For some games like Fallout 4 I had to download an app called "openvr advanced settings". It has height adjust, floor adjust and controller adjust. This works for Steam games. I don't play other games so I have no knowledge there. When I play Skyrim I play as a mage and my elbows never leave the arms of the chair. I use followers to do the close-in stuff while I play the healer. Shooters are a bit harder, but I have not explored the controller adjust feature. If it allows less movement to aim, it would be ideal.
  17. I bought 3 Gigabyte P67 boards and one Z77X board for 2 i5 2500ks and 2 i7 2600ks, all bought several months apart. All had 2 sticks of 4gb ram at first, but when it came time to upgrade to 16gb only the two newest boards would take 4 sticks, and with the old ones I had to buy 8gb sticks. Bios updates did not do a thing. I bought the boards in 2011 and updated them to 16gb in 2016, so out of warranty. I have had similar issues with ASUS A8N-E boards (socket 939 AMD) going from 4gb to 8, so it was not new to me back then. The ASUS P5N-Ds (Intel socket LGA 775) that replaced the AMD boards had issues as well, but I only tried to upgrade 2 out of the 4 bought. I have not put 4 sticks in any of the boards I have bought since 2016 (LGA 1151), but if 50% did not work I would not be surprised.
  18. I use a 3840 x 1600 ultrawide, and as far as I am concerned that is all current high end hardware can handle in AAA games. There are 75hz versions for around $1000 and 144hz versions for around $1800. At that resolution I do the Shadow of the Tomb Raider bench averaging 105 fps on the Ultra preset. The RDR 2 bench using Hardware Unboxed settings averages 88fps. AC:O on Ultra averages 65fps. I played it at a bit lower settings and got around 80 average. That is with a 5.1GHz six core and a FTW3 Ultra at 2070MHz most of the time. I am going to stick with the resolution and see what improvement next gen cards have to offer. I also run a B9 LG OLED with a 2080 ti and it will be running at 120hz by the end of the year. I don't think even a 3080 ti will average 100 with the settings I like, but here's hoping.
  19. I use a B9. It may be Windows settings getting in the way. My gaming computers are set to never sleep and never turn off the screen, so the monitor/TV can do its own thing. The LG OLEDs have lots of features to prevent burn-in. These come into play when the TV thinks there is no activity. Is the banner coming up that says "Instant Game Response Launched"? If not, the TV does not know that you are playing a game. I did buy a 2.1 HDMI cable for my B9. It is an AUDIANO 8K HDMI 2.1 Cable.
  20. Get the G-SYNC Pendulum Demo. https://www.nvidia.com/coolstuff/demos#!/g-sync Run it and in the top left hand corner you will see Vsync, No Vsync and G-sync. If you have a G-sync or G-sync compatible monitor it will default to that. If you have a Freesync monitor with Freesync turned on, it will default to No Vsync and the pendulum will be smooth. If Freesync is turned off, only Vsync is smooth. Most LG monitors have Freesync turned off in the monitor menu by default. To turn it on, look under "Picture/game adjust".
  21. Yes. The most critical time is when you are upgrading a CPU. You are not going to get all the dust out of your computer before upgrading, and one short hair or eyelash could fry a CPU or motherboard. When it happened to me it was an eyelash. At the time I was building and upgrading quite a few computers, and after a clump of dust did the same thing I stopped building, upgrading and fixing other people's computers for free. Now I use purpose-built blowers along with canned air, and I always inspect with a magnifying glass before inserting a CPU.
  22. I use cheap WD Passports for backups. I have gotten 1 and 2tb drives for the price of a game ($60 for 2tb now in the US). The next one I buy will be 4tb. They are $129 now, but I have seen them go for around $70 on Black Friday. I don't use special software to copy files; I just use the mouse. I have never had one fail yet, but they are only spinning when I need something from them, and that is rare.
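A drag-and-drop copy like the one described can also be scripted. This is a minimal sketch of an incremental mirror; the directory names and the skip-if-not-newer rule are my own assumptions, not something from the post.

```python
import shutil
from pathlib import Path

def backup(src, dst):
    """Mirror src into dst, copying only files that are missing
    or newer (by modification time) than the existing copy."""
    src, dst = Path(src), Path(dst)
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
                shutil.copy2(f, target)  # copy2 preserves timestamps
```

Run it against the Passport's drive letter, e.g. `backup("C:/Users/me/Documents", "E:/backup/Documents")` (hypothetical paths); unchanged files are skipped, so repeat runs stay quick.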
  23. AC:O is strange. The difference between me playing at 4k or 1440p is only about 10 frames. I like this guide for getting the most out of the game.
  24. Thanks for posting the video. I had GTX 980 tis in SLI and they did beat my GTX 1080 ti in my SLI games. Two GTX 1080 tis did the same against my RTX 2080 ti. It looks like I may scrap my SLI plans for the 2080 tis when the 3080 ti comes out.
  25. I had that sort of thing happen with my 980 ti setup and it ended up being the SLI bridge. Fortunately I had another and it worked. When I did 1080 tis in SLI I had a fancy bridge with a logo on it. Only one card was recognised. I flipped the bridge so the logo was upside down and it worked perfectly. You have the right motherboard for the job, it may be something simple like it was with me.