Kuzma

Member
  • Content Count

    3,503
  • Joined

  • Last visited

Everything posted by Kuzma

  1. Kuzma

    290 100% Fan

    Bump....
  2. Kuzma

    290 100% Fan

    Installed an HG10 A1 onto my R9 290, but during the boot sequence, after a while, the GPU fan goes to 100%. I haven't run it for long enough to see if it will crash. I can't actually boot into Windows before this happens, so I can't check the GPU temps. Any help? I've re-installed the HG10 A1 twice now just to guarantee it's not an issue with the heatsink/fan. Not sure if I should just keep running it and see what happens.
  3. Kuzma

    290 100% Fan

    Ah okay, so the slight issue is that Windows seems to be forcing a defrag. I'm a tad worried about sitting through the defrag.
  4. Kuzma

    290 100% Fan

    "I can't actually boot into windows before this happens so I can't check the GPU temps."
  5. Kuzma

    Freezes !

    Hellooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo ancient and inactive member here o-o anyone remember me? Anywayyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy, I am having an issue in which I get a black screen and the keyboard+mouse become unusable, but I can still hear people if I am in a Skype call and they can hear me too. If I unplug my headset and plug it back in, however, this no longer happens. Any ideas? I've run check disks, clean-installed Windows 10 three times, and removed the OC on my CPU. Oddly enough, my GPU ran a 6-hour stress test perfectly fine, but any form of DirectX interaction results in this black screen issue coming up faster than usual, although it happens completely spontaneously. I don't get a crash dump 90% of the time due to the fact that my PC doesn't actually crash (hence the Skype calls). Crash dumps that I have had are here.
  6. Kuzma

    Freezes !

    Appears to be fine on Linux (where I'm sending this message from). But I keep getting the issue no matter what on Windows 10.
  7. Kuzma

    Freezes !

    Check my sig, I have an R9 290. Anyway, I guess I could try that? It happens simply on the desktop with nothing open, so I don't think installing anything should prevent a crash in the default Windows 10 programs?
  8. Kuzma

    Freezes !

    I have tried versions 15.12, 15.13, and 15.2, all with the same issue. I've set a max temperature in the drivers, so overheating isn't the problem, and the issue happens at idle too.
  9. Been lurking so hard I forgot to log in....

    1. Emperor_Piehead

      Emperor_Piehead

      Lol i do that sometimes now

  10. Introduction

    I've seen a lot of people arguing about whether you need more than 2GB of VRAM at 1080p, and about how much you really need. I can't do the benchmarks personally, but I figured I'd leave the sources used at the end ^_^ .

    2GB isn't enough, right?

    This is kinda right, but wrong in the majority of cases: games don't use more than 2GB of VRAM at 1440p, let alone 1080p. The main exception to this rule (I'm sure you can guess what it is :P ) is Crysis 3, which uses 2.2GB of VRAM at very high settings + 8x MSAA; this can be fixed by simply using 4x MSAA.

    Sources?

    Each letter will have a different link: C F H M B

    Conclusion

    As you can see from the sources above, 2GB is plenty of VRAM for 1080p in the majority of cases, but with the new consoles we're also most likely going to see a new tier of games with much higher VRAM usage; if you look at some of the games showcased at E3, they seem to be making good use of the memory from the unified system. A good example of this is Tom Clancy's The Division. So I think we can expect >3GB for 1440p and >2GB for 1080p, but that's purely my own speculation ^_^ .

    Update

    Battlefield 4 has been confirmed to use 2.2-2.3GB of VRAM, so think about that if you plan to play this game.
  11. I HAVE RETURNED.

    1. megadarkwood
    2. Kuzma
    3. Emperor_Piehead

      Emperor_Piehead

      i've just been stalking around the forum not really posting anything

  12. Introduction:

    You may have seen me post around quite a bit telling people not to buy the 4GB variants of the 760, 770, 660 Ti, or 680, or the 6GB variant of the 7970. This is due to a little factor called the memory bus size. It's hard to explain on a technical level while still making it easy to understand, so I'll simplify it down to a real-world analogy.

    What exactly is the memory bus?

    The memory bus is the pathway that your GPU uses to access its memory and is generally measured in bits (8 bits = 1 byte :P ). It works together with the memory clock speed to determine exactly how much of the memory can be accessed per second.

    So how will it affect my graphics card?

    Think of the memory as water and the memory bus as a tunnel. If you need more water than your memory bus will let through, you're going to have to wait a while for that extra water to come through. If your graphics card has a memory bus designed for 2GB and you add another 2GB, then you've added more water without being able to get that extra water through the memory bus.

    What about memory clock?

    The memory clock is like the speed of the water: if you increase the speed of the water enough, you can push more water through the small memory bus ^_^ . The issue, however, is that you need a pretty large speed increase to access double the water compared to before.

    Conclusion

    So if you're buying a variant of a graphics card with double the memory, make sure that either the memory clock is increased or you know you'll be able to increase it (GPU Boost 2.0); otherwise all that extra memory (and that extra cash) is wasted. Since memory clocks generally aren't very high, a good rule of thumb is 128 bits and a 1000MHz effective memory clock per GB (this can change to 64 bits and 2000MHz and vice versa, so make sure you do your math :D to work out whether you're going to be able to use all that memory).

    P.S. I thought I'd add in my Titan calculations for any of you mathematicians reading this (attached as a txt). By my calculation, a Titan would need exactly double its effective memory clock speed to access all 6GB of its memory; that, or it's only accessing 3GB of its memory. Since we've already seen the Titan use more than that, I'm left with the conclusion that GK110 only needs half the memory clock to access all of its memory, which to me is a crazy revolutionary advancement O.o titan calculations.txt
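    The bus-width-times-clock arithmetic above can be sketched in a few lines. This is a minimal sketch of the post's rule of thumb, not an official formula; the function names and the example card (256-bit bus, 6000MHz effective clock) are my own assumptions for illustration.

    ```python
    # Rough arithmetic behind the "128 bits and 1000MHz effective clock per GB"
    # rule of thumb: peak bandwidth scales with bus width times effective clock.

    def bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: int) -> float:
        """Peak memory bandwidth in GB/s: (bus width in bytes) * effective clock."""
        return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

    def rule_of_thumb_ok(bus_width_bits: int, effective_clock_mhz: int, vram_gb: int) -> bool:
        """Check the post's heuristic: ~128 bits * 1000MHz of bus/clock product per GB."""
        return bus_width_bits * effective_clock_mhz >= 128 * 1000 * vram_gb

    # A hypothetical 256-bit card at 6000MHz effective clock:
    print(bandwidth_gbps(256, 6000))        # 192.0 (GB/s)
    print(rule_of_thumb_ok(256, 6000, 2))   # True  -- plenty for 2GB
    print(rule_of_thumb_ok(256, 6000, 16))  # False -- 16GB would be wasted
    ```

    Doubling the VRAM without raising the clock leaves the bus/clock product unchanged, which is exactly the post's point about the 4GB variants.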
  13. http://www.youtube.com/watch?v=F8jPui5NXc0 Just a vid I did
  14. Kuzma

    Linus league of legends rank team?

    Anyone wanna do some ranked? I hit 30 now, so...
  15. Kuzma

    Need Voice Actor For History Project

    I was summoned?
  16. Two portrait monitors is surprisingly nice.

    1. gastew15

      gastew15

      It is a nice thing to have. When I get a third monitor I'd like to put it in portrait so I can put my lists and file structures on that one when I'm programming.

    2. wng_kingsley7

      wng_kingsley7

      What monitors are you using? Sounds pretty sweet.

  17. The weirdest part is that, according to this, I have the least issues with the two colours I actually have the most issues with?
  18. I'm colour blind and have a BAD TN panel so
  19. I'm red-green colour blind and I'm not even gonna BEGIN to try and do this ._.
  20. Not saying they're comparable cause they're REALLY not but I'm just saying they're still 6 cores. (I like AMD too but still have my Xeon )
  21. I hate to be elitist but ssssssssssh until you know what you're talking about because Intel's CPUs also share resources. http://linustechtips.com/main/topic/43829-the-difference-between-amd-cores-and-intel-cores/
  22. Got a sore throat, my voice is deeper than usual xD.

    1. Cryptonite

      Cryptonite

      Lol, you getting sick? I'm busy recovering >_<

    2. Kuzma

      Kuzma

      I reckon I was a little dehydrated :P had a drink and I'm fine.

    1. Cryptonite

      Cryptonite

      Status updates.. Y U NO HAVE LIKE BUTTON?!

    2. Kuzma

      Kuzma

      xD then it'd be LinusFacebookTips

    3. Cryptonite

      Cryptonite

      true, but not exactly.. they could also add a dislike button >_<

  23. If you want to see a good OC on base clock, just look at me OC'ing from 2.27GHz to 4.1GHz.