pokgoqese (Member, 20 posts)

Everything posted by pokgoqese

  1. So I opened it and cleaned it, and it works fine now! At least for now. I can't say why it was acting like this, though, since the mouse wheel is completely separate from the left mouse button. But there was some dirt on one of the tiny components, so maybe it was "shorting" or something.
  2. OK, I will try that and let you know. Just a small correction: it only stops dragging if I hit the left-click button.
  3. It's 21 months old, an A4tech Bloody V8. Not worth RMAing for the price. I may open the mouse, though I don't feel any change in the clicking, so maybe it just needs cleaning indeed. The screws are behind the metal glide pads though, which is annoying. It keeps dragging even if I let go of the wheel; the only way to make it stop is to do a left click.
  4. Hi, today my mouse started giving me this problem: when I scroll with the mouse wheel, it also does a left click and drag. Is there any way to fix this? Perhaps I accidentally hit some keyboard combination and enabled this as a feature, or is the mouse wheel simply defective?
  5. Makes sense, thank you. The better-cooled ones will probably have the TDP set higher. I was looking at the MSI GE75 or GE63, or the super beefy MSI Titan.
  6. I just saw it now: there are some laptops with the GTX 1080 set to a 200W TDP that perform best, some at 190W, others at 150W... Well, that's confusing... I don't know what to pick then...
  7. Laptop RTX cards are downclocked and have a lower TDP than their desktop counterparts, so they perform worse, and I'm not talking about Max-Q.
  8. Let's assume they are the same laptop model, with the same cooling and everything, and the same CPU. The RTX 2070 in the MSI GE75 seems to sit around 75°C; I'm not sure how hot the GTX 1080 runs in comparison.
  9. Which one of these is stronger? I heard the GTX 1080 is stronger, because the laptop RTX 2070 is cut down compared to the desktop version, but I can't find any benchmarks. I'm not talking about the Max-Q versions.
  10. Yes, two sets. Okay, thank you. I'm talking about replacing the Capxon GF with Nippon Chemi-Con KY, but there is also the Nippon KZM with a higher ripple current rating than the original and a 10k-hour lifetime. Nippon has around 150 different capacitor series, and about 12 series made for power supplies like this, so it is really difficult to determine what the best choice would be, so I'll just stick to that 470uF/25V... it claims to have low ESR as well. I don't know if they are better, but they have a longer lifetime rating. The originals lasted for 12 years, though. By "too big" do you mean their capacity in uF? Because I thought their physical size does not matter as long as the capacity is the same. I will stick to the same capacity and voltage, just to be safe.
  11. It's an LCD monitor's power supply. There are three 470uF/25V caps in parallel, and another three 470uF/25V in a second parallel group. As I don't know much about these things, I'm just trying to get the exact same specs as the original caps.
  12. The originals have a ripple current rating of 1260mA, while the ones I want to buy are rated for 1210mA; otherwise they are all 470uF/25V, 105°C. The originals are rated for 2000 hours, while the ones I'm trying to buy are rated for 7000 hours. Someone said I can go higher than the original, but not lower. I can't find more info about this. Does anyone know? (There is a small spec-check sketch after this list.)
  13. I guess I will buy a 75Hz monitor in the future, unless 144Hz becomes cheaper here (75Hz is already a big improvement for me over 60Hz). Can I switch to a 60Hz mode on these monitors without underclocking manually? Just in case FreeSync doesn't fix the micro-stutters in this situation, because micro-stutters at 60FPS @ 75Hz are really noticeable to me and Vsync or Enhanced Sync does not fix them. But I believe FreeSync should fix it? If I remember correctly, FreeSync is basically variable refresh rate, or something like that... but I could be wrong. I fixed the GPU working harder: I set the custom resolution to CVT Reduced Blanking (it seems to be the timing standard used by my monitor at 60Hz). It increased the G.Pixel Clock and reduced the G.Refresh Rate from 75Hz to 74.928Hz. The GPU sits at 37°C at idle now with this; before this small tweak it was sitting at 51°C, and only because the GPU spun up the fans now and then to keep it there, otherwise it would go higher... Anyway, 30-45FPS at 75Hz still looked better than at 60Hz, but 60FPS is smoother at 60Hz. I was testing Resident Evil 2 and Assassin's Creed games.
  14. Thank you for the explanation. Then I guess it's not a completely bad idea to get a 144Hz monitor even if I won't reach 144FPS. Though I noticed that when I overclock the monitor to 75Hz and boot up a game that only supports a 60Hz mode, the game is a bit choppy running at 60FPS, but when I switch the monitor to 60Hz the game runs smoothly again. But I think most, if not all, new games should support a 75Hz mode. By the way, won't a higher refresh rate monitor put more stress on my GPU? I ask because when I overclock the monitor to 75Hz, my GPU idles at a higher voltage and temperature... I don't know if it is because of the overclock feature. I wonder if the same would happen if I bought a native 75Hz monitor without overclocking it.
  15. Hi there, I'm asking because I overclocked my monitor from 60Hz to 75Hz, and when I play a game where I get ~40FPS, the game feels a bit smoother and less choppy in the 75Hz mode than at 60Hz at that same ~40FPS. I always hear that you have to reach the monitor's refresh rate with your FPS to see a difference, but it feels smoother and less choppy to me even at ~40FPS (there is a small frame-pacing sketch after this list that illustrates why this can happen). Playing games at 75FPS @ 75Hz definitely feels way smoother to me than 60FPS @ 60Hz. I can't imagine what 144Hz must look like! :D I can't go back to 60Hz anymore.
  16. I'm not sure what you mean by "at boot", but the voltages I reported were while gaming. Idle voltage stays at 0.750V both with the AMD defaults and the BIOS defaults. The RX 580 is locked to 1.2V max, so I was surprised to see this voltage... especially when the BIOS defaults are 1150mV max. I know about Afterburner, thank you. I used it for monitoring, but AMD has its own monitoring overlay. I just switched the voltage settings to manual in Wattman.
  17. Ah, OK... Thank you very much. I was worried something was wrong with my card, because I had always seen a max voltage of 1.15V in the 3 months since I got the card, probably because I used Trixx before and hit the default button. Yesterday I was playing around with Wattman, then installed a new driver and noticed my voltages run at 1.2V, which scared me. Purging everything, even with DDU, did not help. I think the AMD driver should load the BIOS defaults, though; it just uses more power and produces more heat for no reason. How is it with NVIDIA? Does the driver ignore the BIOS settings as well?
  18. Yes, the BIOS's max is 1.15V for the last 3 states. So are you telling me the voltages are normal and it is not a bug but a feature? Shouldn't the AMD driver respect the BIOS specs?
  19. Hello, is it normal that the AMD drivers raise my RX 580's voltage to 1193mV and up to a max of 1200mV? Because if I hit the "default" button in Sapphire Trixx, the max voltage won't go over 1150mV. And if I try to edit the voltages in AMD Wattman, 1150mV is preset there too in the last three states, even before installing Trixx. So why is AMD Wattman raising the voltage to 1193mV and then to 1200mV on default/auto settings? Is this a feature or a bug? It happens on the 18.9.3 driver, on the newest 19.3.2 driver, and on some older ones. The only way to get the 1150mV is to tell Trixx to use default settings, or to switch the voltages to manual in Wattman.
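A minimal sketch of the replacement-capacitor reasoning from posts 10-12, using the figures quoted there (470uF/25V, 105°C, 1260mA vs 1210mA ripple, 2000h vs 7000h rated life). The "match capacitance; equal-or-higher voltage, ripple, temperature and life" check is just the rule of thumb mentioned in the thread, not a manufacturer spec; the 55°C operating temperature in the lifetime estimate is purely an assumed value, and the 10°C-per-doubling approximation is the usual rough guide for aluminium electrolytics, so treat the numbers as illustrative only.

```python
# Rough check of replacement electrolytic capacitor specs against the originals.
# Spec values are taken from the forum posts; the operating temperature (55 °C)
# is an assumption purely for illustration.

ORIGINAL    = {"uF": 470, "volts": 25, "ripple_mA": 1260, "temp_C": 105, "rated_hours": 2000}
REPLACEMENT = {"uF": 470, "volts": 25, "ripple_mA": 1210, "temp_C": 105, "rated_hours": 7000}

def check_replacement(orig, repl):
    """Rule of thumb: capacitance should match; voltage, ripple current and
    temperature rating should be equal or higher than the original."""
    issues = []
    if repl["uF"] != orig["uF"]:
        issues.append("capacitance differs")
    if repl["volts"] < orig["volts"]:
        issues.append("voltage rating is lower")
    if repl["ripple_mA"] < orig["ripple_mA"]:
        issues.append(f"ripple current is lower ({repl['ripple_mA']} mA vs {orig['ripple_mA']} mA)")
    if repl["temp_C"] < orig["temp_C"]:
        issues.append("temperature rating is lower")
    return issues

def estimated_life_years(rated_hours, rated_temp_C, operating_temp_C):
    """10 °C rule of thumb: life roughly doubles for every 10 °C below the rated temperature."""
    hours = rated_hours * 2 ** ((rated_temp_C - operating_temp_C) / 10)
    return hours / (24 * 365)

print(check_replacement(ORIGINAL, REPLACEMENT))
# -> ['ripple current is lower (1210 mA vs 1260 mA)']  (the one spec worth double-checking)

for name, cap in (("original", ORIGINAL), ("replacement", REPLACEMENT)):
    print(name, round(estimated_life_years(cap["rated_hours"], cap["temp_C"], 55), 1), "years")
# With the 55 °C assumption this works out to roughly 7 years for the 2000 h originals
# and roughly 26 years for the 7000 h replacements; real life depends heavily on the
# actual temperature inside the power supply.
```

Rough as it is, the estimate also shows why 2000h-rated originals running well below 105°C can plausibly last a decade or more, as post 10 observed.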
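The observation in posts 13-15 that ~40FPS looks less choppy at 75Hz than at 60Hz can be illustrated with a small frame-pacing simulation. This is only a sketch under simplified assumptions (perfectly constant frame times, plain vsync where each frame is held until the next refresh tick, no FreeSync or Enhanced Sync); the function name and the 200-frame run length are made up for illustration.

```python
# Simplified vsync frame-pacing model: a constant-frame-time game displayed on a
# fixed-refresh monitor. Each rendered frame becomes visible at the first refresh
# tick at or after it is ready; the spread of on-screen durations is a rough
# proxy for judder.
import math
import statistics

def displayed_durations(fps, refresh_hz, n_frames=200):
    frame_time = 1000.0 / fps        # ms to render one frame (assumed constant)
    refresh = 1000.0 / refresh_hz    # ms between refresh ticks
    ticks = set()
    for i in range(n_frames):
        ready = i * frame_time
        # small epsilon so exact multiples aren't pushed to the next tick by float error
        ticks.add(math.ceil(ready / refresh - 1e-9) * refresh)
    ticks = sorted(ticks)            # if two frames land on one tick, only one is shown
    return [b - a for a, b in zip(ticks, ticks[1:])]

for hz in (60, 75, 144):
    d = displayed_durations(fps=40, refresh_hz=hz)
    print(f"{hz:>3} Hz: mean {statistics.mean(d):4.1f} ms, "
          f"stdev {statistics.pstdev(d):3.1f} ms, worst {max(d):4.1f} ms")

# Under these assumptions, 40 FPS on 60 Hz alternates between ~16.7 ms and ~33.3 ms
# presentations, while on 75 Hz most frames hold for ~26.7 ms with an occasional
# ~13.3 ms one, so both the frame-to-frame variation and the worst hitch are smaller.
```

The point is simply that a 13.3 ms refresh interval lets a mistimed frame miss by less than a 16.7 ms one does; the sketch says nothing about input lag or about how FreeSync, which varies the refresh interval itself, would change the picture.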