
davidq

Member
  • Posts

    16
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Male
  • Location
    Ontario
  • Occupation
    3D Game Artist

System

  • CPU
    Intel i7 4770k
  • Motherboard
    MSI Z87-GD65
  • RAM
16GB 2133MHz Patriot Viper Venom Red
  • GPU
    MSI R9 280X TwinFrozr Gaming
  • Case
    Fractal Design Define R4
  • Storage
    250GB Samsung 840 EVO & 1TB Western Digital Green
  • PSU
    Corsair AX860i
  • Display(s)
    3 x Acer H236HLbid 23" IPS
  • Cooling
    Be Quiet! Dark Rock Pro 2
  • Keyboard
    Mionix Zibal 60
  • Mouse
    Razer DeathAdder 2013 Black Edition

davidq's Achievements

  1. The problem is the glitching shown in the video. I have it connected via DP.
  2. Current specs:
       • CPU: AMD R7 3700X
       • Mobo: Asus ROG Strix B550-F Gaming (WiFi)
       • GPU: MSI RX 5700 XT Gaming X
       • RAM: Corsair Vengeance LPX 32GB 3200MHz CL16
       • SSD: WD Black SN750 1TB
       • PSU: Corsair AX860i
       • Primary monitor: Wacom Cintiq Pro 24
       • Secondary monitor: Acer B346C

     As the title suggests, I've been getting glitching issues with my (fairly new) 5700 XT on my (not so new) ultrawide secondary monitor (see the video for an example). I've tried various troubleshooting methods scoured from the interwebs, to no avail:
       • DDU
       • installed the latest drivers
       • installed the latest BIOS
       • disabled all the AMD enhancements in the Adrenalin software
       • tried a different output on the graphics card
       • tried a different input on the monitor
       • tried a different cable
       • disabled the OC on the GPU
       • disabled XMP
       • reseated the GPU
       • used two separate PCIe cables to power the GPU
       • switched the PCIe slot to Gen3 in the BIOS
       • changed the resolution on the monitor (the only thing that got rid of the glitching, but I'd rather not torture myself with 1920x1080 on an ultrawide)
       • tried the ultrawide on a system with an MSI GTX 1070 Ti Gaming (no glitching)
       • tried the ultrawide on a system with an MSI GTX 1660 Armor OC (no glitching)
       • tried the ultrawide on a system with an MSI R9 280X Gaming (no glitching)
       • tried a Dell P2417H as the secondary monitor instead (no glitching)
       • used the Cintiq as the only monitor (no glitching)
       • used the ultrawide as the only monitor (glitching present)

     At the very least it only seems to be a visual artifact, but it's quite distracting while doing work. It leads me to believe the problem lies with the combination of the ultrawide monitor and the 5700 XT. The most similar situation I've found online was someone using an Acer ultrawide as well (a different model), but he was able to fix his issue by limiting it to 60Hz (my monitor is locked at 60Hz at 2560x1080). A small script for double-checking what mode the card is actually driving each monitor at is sketched at the bottom of this post list.

     I'd like to think nothing is wrong with my ultrawide monitor, as I've tried it on various systems without the glitching present... although, when I had it plugged into the 1070 Ti via DVI, the computer would recognize the monitor but the screen stayed black (the HDMI/DP ports worked without a problem). I also realize the 5700 XT family of GPUs had a rough past when it came to drivers... perhaps it's a snowflake issue that never got addressed? I would've returned the GPU by now and got myself a 3080 if GPU stock weren't so messed up at the moment. At the same time, if it isn't a problem with the GPU, perhaps my monitor needs updating?

     Has anyone had any luck or woes connecting ultrawide monitors to a 5700 XT? If anyone's got any more insight on the problem, I'd very much appreciate it.

     473745119_5700xt_glitching(1).mp4
  3. My rig:
       • Patriot Viper 3 2x8GB DDR3 2133MHz
       • MSI Z87-GD65 GAMING
       • Intel i7-4770K
       • Corsair AX860i
       • MSI GTX 1070 Ti GAMING
       • be quiet! Dark Rock Pro 3

     First of all, I understand that mismatched RAM is inadvisable, but it's been six-ish years since I built this system, and finding the exact pair of sticks is no longer an option. Due to working from home I'm finding the need to upgrade my RAM, so I'd appreciate any advice on what my best option would be. So far I'd say I've got about three options: same speed, same brand (also has the same voltage), or same timings (also has the same voltage). I don't care so much about negligible performance gains/losses, more so the reliability and longevity. If you know of a better option, please let me know (keep in mind, I chose those sticks above because of the CPU cooler clearance). I'd also rather not start from scratch and buy a 32GB set.

     Just a bit about what I use my computer for: I work in animation, doing anything from drawing in Photoshop to compositing in Toon Boom/After Effects. If I'm not working, I'm gaming (Battlefield/COD/RedDead/AssCreed). My system is not overclocked and I believe an extra couple of FPS in gaming is not worth the hassle, but it will bother me if a brush stroke in Photoshop isn't instant. Thanks in advance!
  4. Hey folks, what CPU currently on the market is best suited for a 280X? I have a spare card that's collecting dust in my closet and was hoping to build a gaming PC for my living room. Thanks in advance~
  5. I just want a second opinion on my situation. On my main computer, I'm housing an MSI 280X which is powering an Eyefinity setup (5760x1080) plus an auxiliary display (12" Cintiq). The most taxing things I do are 3D modelling and gaming (anything from Counter-Strike to Star Citizen). I've been contemplating upgrading my graphics card for a while. My plan: purchase another 280X (preferably the same one) and run CrossFire. With all the sales going on, it's getting more and more tempting, and with the current Never Settle: Space Edition promotion, I was hoping to get my girlfriend started with Star Citizen too. I showed her the Mustang Omega model and she loved it.

     Although, in a year or two, when Star Citizen (hopefully) comes out, I plan on upgrading again to a more powerful single-card solution (who knows what's out by then) and transferring my CrossFire cards into my secondary system, replacing its Gigabyte 6850 (powering a single 1920x1080 display). Hopefully 280X CrossFire can still handle Star Citizen by then, but until then I would at least be able to run my games at >60 FPS on my main system.

     I realize single-card solutions are always 'better' and that some 290X cards are matching 280X prices with the current sales, but in two-ish years when I upgrade again and hand down a 290X to my secondary system, I would have a 280X just lying around. If I have two 280Xs now, I could at least CrossFire them later on the secondary. Also, there are the new Nvidia cards, but no Never Settle: Space Edition with a Mustang Omega. Like I said, I just wanted a second opinion. Or someone to tell me I'm about to make a huge mistake...
  6. Hey guys. I just got a new headset, but I'm experiencing some buzzing noise when I plug in my headset's mic. My setup goes as follows:
       > Computer
       > USB
       > TASCAM US-122MKII
       > Speakers
       > FUNC HS-260 (headset, audio only)
       > 3.5mm (sound card on the MSI Z87-GD65)
       > Mionix Zibal 60 (keyboard)
       > FUNC HS-260 (headset, mic only)

     I find this is the most convenient setup, as it avoids having to continuously plug in wires when I switch between my speakers and headset. However, I've noticed a buzzing noise when my mic is plugged in with this setup. When I unplug my headset's mic line from the keyboard, the problem goes away. When I have the headset's mic and audio lines both plugged into the keyboard (instead of the audio line coming from the audio interface), the problem also goes away, but with that setup I'd continuously have to go into the computer's sound settings and switch the default device whenever I want to switch between my headset and speakers. I'd like to believe this setup should just work flawlessly, but it doesn't. Can anyone shine a light on a problem I may have missed?
  7. I got three of those Acer ones. They're great, but I find myself having to turn off my lights when I play games due to the reflections. The AOC one looks pretty sleek, but I don't like that it has no DVI input.
  8. I'm running eyefinity (5760x1080) on a 280x. I play BF4/AC4 on medium-high settings, anything above that and the frames start to drop below 30. If you absolutely want to buy a new card, I suggest saving up. Other than that, I'm gonna go with Joshyy's idea and go crossfire with the same card.. that's if your PSU can handle it. It would probably be your cheapest solution to get better performance for your price range.
  9. davidq

    What to do :D

    Dark Rock Pro 2 is pretty sweet, but you WILL have clearance issues with Vengeance sticks.
  10. If you don't plan on returning your RAM or using another cooler, then do this. My Dark Rock Pro 2 gives clearance for sticks up to 42mm(-ish) in height; Vengeance are 52.5mm. Also, taking the heatsinks off isn't such a bad idea either, because once the cooler is installed it'll pretty much cover up all the RAM slots (so much for trying to colour-coordinate my sticks... lol).
  11. I ran into the same dilemma. I suggest considering the technologies they come with (Mantle, Easy Eyefinity, G-Sync, ShadowPlay... I don't know what else) and evaluating their importance to you. I bought the 280X for Eyefinity and because I'm more interested in Mantle. Plus, it was cheaper at the time... stupid mining. So far I'm enjoying my 280X; it has no problem running BF4 and AC4 in Eyefinity.
  12. I saw that, but it didn't give any insight on the R9 series cards, since they did something different with them. I was doing more research to find out whether the adapter that came with my card was active or passive, and I came across this: http://www.pcworld.com/article/2052312/msi-radeon-r9-280x-video-card-review-amd-teaches-an-old-gpu-some-new-tricks.html

     "Speaking of DisplayPort, older GPUs depend on it to deliver AMD’s EyeFinity multimonitor technology. With those cards, you can connect a maximum of two HDMI or DVI displays, but the rest must be DisplayPort (or have active DisplayPort adapters). GPUs in the Radeon R9 family support up to three HDMI or DVI displays, and you can add up to three more (using DisplayPort) to rock six monitors in all."

     There's no mention of the R9 series needing an active adapter for a 4th display, so I'm just going to have to find out for myself... (that hookup rule is sketched as a quick checklist right below).
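For post 12, here is a minimal Python sketch that encodes the quoted PCWorld rule as a checklist. The function name (eyefinity_plan_ok) and the "dp"/"non_dp" labels are purely illustrative assumptions of mine; the logic is nothing more than the quoted passage, not anything from AMD's own tooling.

```python
# Hypothetical helper encoding the Eyefinity hookup rule quoted in post 12 (PCWorld).
# Assumption: "non_dp" covers HDMI/DVI; DisplayPort, or an active DP adapter,
# counts as "dp". This is just the quoted rule as a checklist.

def eyefinity_plan_ok(gpu_family, connections):
    """Return True if the planned monitor hookup fits the quoted rule.

    gpu_family:  "r9" for Radeon R9 cards, anything else for older GCN cards.
    connections: one entry per monitor, each "dp" or "non_dp".
    """
    non_dp = connections.count("non_dp")
    dp = connections.count("dp")

    if gpu_family.lower() == "r9":
        # R9 family: up to three HDMI/DVI displays, plus up to three more on
        # DisplayPort (six monitors in all).
        return non_dp <= 3 and dp <= 3
    # Older cards: at most two HDMI/DVI displays; any further displays must be
    # DisplayPort or use active DP adapters (both count as "dp" here).
    return non_dp <= 2


if __name__ == "__main__":
    # Three DVI/HDMI monitors plus a fourth display on DisplayPort:
    print(eyefinity_plan_ok("r9", ["non_dp", "non_dp", "non_dp", "dp"]))   # True
    # Three DVI/HDMI monitors on an older card, no DisplayPort:
    print(eyefinity_plan_ok("older", ["non_dp", "non_dp", "non_dp"]))      # False
```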
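And for the 5700 XT glitching in post 2: a minimal sketch, assuming Windows and using only ctypes against user32 (EnumDisplayDevicesW / EnumDisplaySettingsW), that prints the resolution and refresh rate each display is currently being driven at, so you can confirm the ultrawide really is getting 2560x1080 @ 60Hz. The helper name (list_display_modes) and the trimmed DEVMODE layout (unused fields left as padding) are my own; it only reports what the driver says it is outputting.

```python
# Windows-only sketch: list each display adapter and the mode it is currently driving.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

ENUM_CURRENT_SETTINGS = -1  # ((DWORD)-1), per winuser.h


class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]


class DEVMODEW(ctypes.Structure):
    # Field order follows wingdi.h; fields we don't read are kept as raw padding
    # so the struct size still matches what the API expects (220 bytes).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("_union", ctypes.c_byte * 16),   # printer/position union, unused here
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("_tail", ctypes.c_byte * 32),    # ICM/panning fields, unused here
    ]


def list_display_modes():
    """Print the current resolution and refresh rate of every attached display."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break  # no more display adapters
        mode = DEVMODEW()
        mode.dmSize = ctypes.sizeof(mode)
        if user32.EnumDisplaySettingsW(dev.DeviceName, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(mode)):
            print(f"{dev.DeviceName} ({dev.DeviceString}): "
                  f"{mode.dmPelsWidth}x{mode.dmPelsHeight} "
                  f"@ {mode.dmDisplayFrequency} Hz")
        i += 1


if __name__ == "__main__":
    list_display_modes()
```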