Everything posted by Drawbar

  1. Everything is working, and ASRock will get sorted out. This is my first time with ASRock, so there is a very good chance it's my fault. I only build these things just far enough apart to forget stuff.
  2. Sorry you don't believe me. I went through something similar when I put a 3900X on my Gigabyte X470 Gaming 7. The memory worked right, but the vcore and boost clocks were a mess for about 4-5 months.
  3. Thanks guys. Yeah, I know, that Bazooka would just get too hot for what my wife needs that thing for. Anyway, something to be said for Intel: put it together and it always worked as expected. I've been an early adopter three times now, and it's stupid stuff (but interesting, I have to admit) every time. Sorry guys, I'm tired, it's late, and the stupid thing is working. I've got some junk going on in the background now. With a couple fewer fans maybe I could have gotten away with the Bazooka...
  4. Advice needed. I put together an ASRock B550M Steel Legend with a 3700X and 32GB of GeIL RAM (2 sticks). Everything is stable, but underperforming. I had all these components on an MSI B350 Bazooka board and things were better (I was selling that B350, flashing the BIOS to run Ryzen 3000 and testing it as a favor). I've flashed to the latest BIOS. Cinebench scores (R15 and R20) have dropped around 15%.
     Problems:
     1. 3700X max boost per HWiNFO64 is 4350MHz; on the B350 I had 4400MHz on 3 cores. Same cooler, same temps, vcore dancing as usual.
     2. Memory compatibility. The QVL for this board at 3600 is a joke when you want/need 32GB. I finally got the GeIL to run at 2666. I have a couple of 16GB G.Skill kits (cheap Hynix and Samsung stuff) that will run at 2133. The GeIL would run 3600 (XMP) and the others would run 3200 (XMP) on that old B350.
     3. Power draw. The B350 (same components) would pull about 50-60 watts more. Same power plan.
     I've got to be missing something somewhere, or the ASRock board isn't ready yet... I would have kept the Bazooka, but that thing's VRM gets scary hot when you work it.
     Questions:
     1. Do these boards always get released with such a bad QVL list? Been waiting forever for B550.
     2. Any advice on how to get the memory to run better (memory timings, etc.)? I've already got a job.
  5. Thanks for the reply, SB. Forgot to mention PC 3 has to stay mATX or I'll have to change cases too.
  6. I have a 3-PC household.
     PC 1: The living room unit is an R7 2700/B450 Tomahawk/GTX 1070 Ti/32GB 3200 rig. It does some light gaming, but since we cut the cord its main job is to supply the TV. Overkill, I know, but it's made primarily from upgraded parts.
     PC 2: My main game rig. This is where I allow myself to spend money that doesn't make sense. I enjoy flight sims and have a lot of stupidly expensive peripherals; you get the point. It's an R9 3900X/Gigabyte's top X470/RTX 2080/32GB 3200 CL14/1440p @ 144Hz.
     PC 3: The workhorse; it's on 24/7. My wife has a small business (REALLY SMALL). She makes some short promo videos to help her sell her paintings, etc. She is learning more about this kind of stuff, so more power here is in the future, I think. It's an R7 1700/B350 Bazooka/GTX 1660 Super/16GB 3200.
     I freely admit that none of this was well thought out ahead of time. It was put together as needed, as funds were available, lol. I also have a 3700X that my son-in-law bought us for Christmas (still in the box). The plan was to put it in PC 3, but there is still only a beta BIOS for that board and I'm not sure... I would really like some advice on how to straighten this out into a reasonable upgrade path for all this stuff.
     My concerns:
     - PC 2 is hurting for more GPU, but still doing 80-120 fps in DCS and IL-2 BoX.
     - PC 3: my wife is outgrowing this thing and starting to ask questions about PC 2, and I need to nip that in the...
     - PC 1: what do I keep from the others, and what do I eBay?
     Note: all PCs have EVGA Gold power supplies, 550 to 750W. I have about $500 in the 2020 budget. I was thinking of a B550 to put that 3700X on, then selling the 1700 and B350 Bazooka while they are still worth something. I am not one for watching hours of YouTube to figure this stuff out, and I'm sure I've wasted money because of it. Any help and advice for what is a really small issue, considering other issues in the world, would be appreciated.
  7. Guess I don't understand bottlenecking very well; I'll have to school up. Thanks. Edit: Just ran the Shadow of the Tomb Raider bench. It says 38% GPU bound, so not a CPU bottleneck there. @dgsddfgdfhgs, keep your head down over there!
  8. The 1070 Ti drops down to 80% usage or less at times, with CPU usage way up, in Shadow of the Tomb Raider for example. FPS will drop below 60 @ 1080p as well when that happens. I guess that's what I consider a bottleneck. Not a big deal in titles like that, though.
  9. Here's my first-world problem. I'm trying to decide which Ryzen CPU to keep between a 1700 and a 2700. I got them so cheap ($129.00 and $139.00 USD, coupons, etc., long story) I couldn't pass them up. This is my secondary/audio/living room/light gaming/wife-doing-annoying-things-with unit. It does have a 1070 Ti and 32GB of 3200 memory on a B450 Tomahawk (it originally had a 2600X), so it's a good backup unit for any of my needs. I've got the 1700 in it right now working fine with the memory at 3200; fearing that wasn't going to work is one of the reasons I bought the 2700. I didn't expect the 1700 to be as good as it is. It runs at 3700MHz simply by setting the multiplier, and it also boosts to 3750MHz on all cores at stock settings (eventually, lol) without issue. That CPU is really underrated. I have a 3900X system also, and I wish it behaved like the 1700 does..., more old school I guess... Anyway, I still have the 2700 in the box and I should sell one of these. Is there enough gain to switch to the 2700? The 1700 does bottleneck the 1070 Ti some, a lot at times in some games (running a 4K TV @ 1080p for most games), but it's a 60Hz screen so I'm not really seeing that anyway. I don't like running an all-core overclock, and I went with the 65W CPUs because that thing is on most all the time. Thinking maybe the 2700 might stretch its legs better at stock? Is there anything I'm missing? I'm really kind of stumped here. Leaning towards keeping the 2700 just because it's newer and supposedly a bit better. I'm only going to get away with tearing the living room apart one more time for a good long while, so any advice would be greatly appreciated!
  10. If you have the time, Buildzoid's channel Actually Hardcore Overclocking on YouTube is going over all the X570 mobos in depth: https://www.youtube.com/channel/UCrwObTfqv8u1KO7Fgk-FXHQ/videos
  11. When I first built a system on a B450 Tomahawk, Core Temp worked properly (R7 1700 CPU). After the last BIOS update I did, it started showing temps on only half the cores, like you have. I also have a system with a 3700X/Gigabyte X470 Gaming 7. The temps are not working right there either, in Core Temp or any other app: they all show up, but they all show the same temp, and they cycle from 39C-47C at absolute idle. Something is not quite there yet with the Ryzen 3000 BIOS, I think.
  12. Is there any benefit to overclocking your CPU for gaming when it keeps the GPU at 95%+ most all the time? I've got two systems: one is an R7 1700/GTX 1070 Ti, the other is an R5 2600X/RTX 2080. Memory in both is at 3200MHz. Cranking up the CPU speed does almost nothing for Heaven benchmarks or Cinebench OpenGL, though it makes a big difference in the Cinebench CPU test for the R7. Seems all I'm doing is making more heat, or am I missing something?
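To put rough numbers on that question, here's a toy model (Python, made-up frame times, not measurements from my rigs): CPU and GPU work on frames largely in parallel, so the slower stage sets the frame rate. When the GPU is the slower one, shaving CPU time changes nothing.

```python
# Toy bottleneck model. All frame times below are illustrative, not measured.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """The slower of the two pipeline stages sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound case: GPU needs 12 ms/frame, CPU only 6 ms/frame.
before = fps(cpu_ms=6.0, gpu_ms=12.0)

# Overclock the CPU ~10% (6 ms -> 5.5 ms): the GPU still sets the pace.
after = fps(cpu_ms=5.5, gpu_ms=12.0)

print(round(before, 1), round(after, 1))  # identical fps before and after
```

So a CPU overclock only shows up in CPU-limited tests (like the Cinebench CPU run), which matches what you're seeing.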
  13. In case you didn't know, the M2B socket on the Gaming 5 supports NVMe only. Make sure you're not trying to use an M.2 SATA drive in there. The M2A socket supports both SATA III and NVMe. The easiest way to tell them apart is that M.2 SATA drives have 2 notches where they plug in, while NVMe drives have only one. The sockets will physically accept either type.
  14. If you are talking about a Ryzen system... you will never notice the difference between 3200 CL14 and CL16 while doing anything but benchmarks or massive long operations like video encoding. The difference in gaming is nil. If you want to overclock your memory, that Samsung B-die stuff is great. I've got 16GB of 3200 FlareX, which is B-die, in a Ryzen system, and getting 3400 out of it is really easy. Otherwise, just get the CL16 stuff and save the money. As others have said, 16GB is plenty for gaming now, and will be for a while yet, I'm sure.
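For anyone curious how small the CL14 vs. CL16 gap actually is, you can work out the first-word latency yourself: CAS cycles divided by the memory clock (which is half the DDR transfer rate). A quick sketch in Python:

```python
# First-word CAS latency in nanoseconds for DDR memory.
def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    clock_mhz = mt_per_s / 2           # DDR: 3200 MT/s -> 1600 MHz real clock
    return cl / clock_mhz * 1000.0     # cycles / MHz -> nanoseconds

print(round(cas_latency_ns(3200, 14), 2))  # 8.75 ns
print(round(cas_latency_ns(3200, 16), 2))  # 10.0 ns
```

That's a difference of about 1.25 nanoseconds per access, which is why it vanishes in anything but synthetic benchmarks.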
  15. That dual BIOS without a switch really isn't dual BIOS; it's more like a backup/restore. The only way to flash the main BIOS seems to be either from the backup, which it is supposed to do automatically when there is a problem, or while the main is still working so you can use Q-Flash as normal.
      Shut your PC down and turn off the power switch, or unplug it for a while. When you start it again it should try to start from the main BIOS and flash it from the backup if something is wrong. If it just boots up, look at your lights and see which BIOS it's on. If it's on the backup, try the power/reset/10-sec thing again and see if you can get it to try to fix the main. I don't think these boards are meant to ever run on the backup BIOS; you're only able to trick it into posting on the backup so you can flash it if needed, I think. This is from older boards, but it still seems to kind of work like this. Here's the page explaining it: https://www.gigabyte.com/microsite/55/tech_081226_dualbios.htm
      I have yet to find a way to flash the main BIOS on those non-switch boards (the X470 Ultra and 5) when it fails. You just have to hope the backup flashes it for you. On your board you can at least look at the lights and see which BIOS it is running on. I've been able to switch to the backup on those and flash it, but I haven't been able to flash a bricked main if it won't fix itself. It's about as non-user-friendly as things get... I ended up RMAing an Ultra Gaming over this.
  16. Gigabyte recently released new chipset drivers for Windows 1903 for some boards. Mine is an X470 Gaming 7 and there is one. They can be fussy about the chipset/BIOS combo, so usually there's a note telling you which chipset driver has to be installed before updating the BIOS. Sometimes they don't want BIOS versions skipped while updating, either. https://www.gigabyte.com/us/Motherboard/GA-A320M-S2H-rev-1x#support-dl-bios I see there isn't anything about 1903 for your board. Try installing the latest chipset drivers: https://www.gigabyte.com/us/Motherboard/GA-A320M-S2H-rev-1x#support-dl-driver-chipset Hope that helps. Or you can uninstall 1903 and wait...
  17. Just picked up a new Ryzen 7 1700 for $130.00 USD with free shipping! AAAwave has it for $149.99 with $20.00 off if it's your first order with them. Going to put it together with some of my spare/old parts for a backup unit: a Gigabyte X470 Aorus Ultra Gaming and 16GB (2x8) Corsair Vengeance DDR4 3000 (Hynix CL15 stuff). I'd appreciate it if someone could save me some time:
      - Will the 1700/X470 run the memory at 3000MHz, with or without XMP?
      - Is anyone running a 1700 at 3.7GHz with the stock Spire cooler?
      Thanks
  18. I'm near-sighted as well, and I also had eye-strain issues while not wearing my glasses at the computer. With my laptop I can see OK without my glasses. My desktop screen is 27", so it's a little farther away, about an arm's length + 6". That's a little too far for me without glasses, and right on the edge with my regular glasses but not comfortable; I suffered with them on or off. I finally had a set made just for sitting at the desktop computer, and it worked out that it made using a laptop more comfortable as well. Best $200 I ever spent, as it became a much more enjoyable experience. Eye fatigue dropped to almost nothing; I can sit for hours now without issue. I do sometimes use a blue-light filter in the evening/night if the ambient lights are low. This helps quite a bit as well; Google is your friend here. I should point out that I use TrackIR a lot, so bifocal lenses won't work for that. They don't work well at a desktop unit anyway, unless you put your monitor down lower to look through the bottom of the lenses.
  19. I think at 1080p with a GTX 1080 you will hit 144Hz on almost everything at ultra settings, and certainly you will be able to by lowering a setting here and there, so G-Sync wouldn't really be needed much. At 1440p you will definitely want G-Sync. That being said, you may want to consider screen size first, which will make the choice easier. At 24" or less, 1080p is fine; if you want to go bigger, 1440p looks so much better. A 27-inch 1080p screen is awful, in my opinion. I'm running a 2600X and a 1440p/144Hz/G-Sync screen with a GTX 1070 Ti (which is really close to a GTX 1080 at 1440p). I'm at 80+ fps in most new titles, with probably an average of 120 across the board; G-Sync is necessary for me. You may also want to consider that at 1080p with a GTX 1080 you are going to be over 144fps a lot, which will also cause tearing that G-Sync doesn't address. Fast Sync addresses this, but can introduce other undesirable stuff. My results with Fast Sync and G-Sync together have been mixed, but when they work well together it is really awesome.
  20. Seems the higher the fps you can get over the 144Hz refresh on the Dell, the less noticeable it is. It's hard to notice it much at 144Hz anyway. It's pretty awful on the 60Hz panel, though.
  21. Fast Sync doesn't kick in fast enough with the fps limited to the same as the monitor's refresh rate; it seems to need some wiggle room. G-Sync only stops the tearing below the refresh rate.
  22. I am using DP. G-Sync is working properly on the Dell; I only get tearing on it above 144 fps, which isn't all that often in AAA games. The laptop with the 60Hz panel tears all the time because almost everything runs faster than 60 fps. This is why I have been experimenting with Fast Sync, which dumps frames above the monitor's refresh rate and should, and sometimes does..., prevent tearing at fps over the refresh rate.
  23. I've been experimenting with using Fast Sync and G-Sync at the same time. I recently acquired a Dell S2716, which is a 1440p/144Hz G-Sync monitor, and have a GTX 1070 Ti pushing it. I also have an MSI laptop with a GTX 1070 and a 1080p/60Hz G-Sync panel. With V-Sync off and just G-Sync on, I get lots of tearing, especially on the laptop, because of high frame rates. Turning on Fast Sync with G-Sync really helps in certain games (especially on the laptop), but adds a small amount of input lag (not nearly as much as V-Sync does, though). Results vary a lot depending on what games I'm running. I've tried using in-game frame limiters or Nvidia Inspector to limit the tearing on the high end (usually I have to set the cap 10 fps or so below the refresh rate), but this leads to terrible stuttering in pretty much everything I've tried. Has anyone else tried running these together? What are your thoughts and experiences?
  24. If you don't have Ryzen Master you can grab this: https://www.alcpu.com/CoreTemp/ You can use it to check your CPU temps and clock speed easily. Your CPU should be below 45C at idle or with little load; if it's over 60C at idle, you've got a problem. Run Windows Update and let it do its thing (you can find it by searching). Then go to your motherboard manufacturer's website, download all the latest drivers for your board, and install them. Just doing those should get things straightened out.
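Those idle-temp thresholds can be written down as a simple rule of thumb. Here's a quick sketch (Python; the cutoffs are the ones from the post above, and reading the actual sensor still needs a tool like Core Temp or HWiNFO, this just encodes the triage):

```python
# Rule-of-thumb triage for an idle/light-load CPU temperature reading.
# Thresholds follow the advice above: <45C is normal, >60C at idle is a problem.

def idle_temp_verdict(temp_c: float) -> str:
    if temp_c < 45:
        return "fine"
    if temp_c <= 60:
        return "warm - check case airflow and dust"
    return "problem - check cooler mount and thermal paste"

print(idle_temp_verdict(38))  # fine
print(idle_temp_verdict(52))  # warm - check case airflow and dust
print(idle_temp_verdict(72))  # problem - check cooler mount and thermal paste
```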