Borek
Member · 31 posts
Everything posted by Borek

  1. What is your current setup? That's pretty relevant information.
  2. Firefox... it's so customizable, it's exactly what I need it to be.
  3. Linus & crew rock my freakin' socks! BTW, we've seen more of Diesel's workstation than the man himself.
  4. Here's an NCIX tech tips video that might help you out.
  5. Not sure if you can call it a hobby, but I love food and I dine out a lot. Definitely not cheap.
  6. If you're going to install a graphics card in the system, is there a reason you're still going for an APU other than cost? I'm sure you can find a Phenom II or FX processor in that price bracket.
  7. So I noticed a problem when gaming on Windows 8: alt-tabbing out of full-screen games took several seconds (5-8 seconds in my case). I managed to diagnose the problem and come up with a solution, so I thought I'd share it with you guys in case you have the same annoyance that I did.

Problem:
-------------
For whatever reason, DirectX 11 in Win8 only allows certain games a 59.9Hz refresh rate setting where it would be 60Hz in Win7. DX9 allows 60Hz on both OSes, and DX11 allows 60Hz on Win7 only. I'm not sure if this is an Nvidia-specific or even HDMI-specific anomaly. When the game is at 59.9Hz and the Windows display settings are at 60Hz, alt-tabbing takes several full seconds. In my experience, the results are the same with Vsync both on and off, so Vsync is not the culprit.

Solution:
-------------
One solution is to use DirectX 9 in your game and set the game to 60Hz to match your Windows display settings. Alternatively, dropping the Windows 8 display settings down to 59Hz while keeping your game at 59.9Hz makes alt-tabbing very quick as well.

I guess my questions for you guys are these: Can you replicate this problem? Can you replicate it using an ATI card? Can you replicate it using something other than an HDMI cable? Can you get your DX11 game to run at 60Hz in Win8? If so, how? If we find more conditions pertinent to this problem, I'll update this post.
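To put a number on how small that mismatch actually is, here's a quick back-of-the-envelope sketch (the arithmetic is mine, not anything measured from the games in question): at 59.9Hz vs 60Hz, the two clocks drift apart by a tiny fraction of a millisecond per frame and only realign about once every 10 seconds, which is plenty to keep the compositor and the game permanently out of step.

```python
# Back-of-the-envelope sketch (my own numbers, nothing measured):
# how a 59.9 Hz game output drifts against a 60 Hz desktop refresh.

game_hz = 59.9
desktop_hz = 60.0

# Seconds per frame for each clock.
game_frame_s = 1.0 / game_hz
desktop_frame_s = 1.0 / desktop_hz

# Per-frame drift between the two clocks, in milliseconds.
drift_per_frame_ms = (game_frame_s - desktop_frame_s) * 1000.0

# Beat period: time until the accumulated drift equals one whole
# desktop refresh, i.e. the two signals fall back into alignment.
beat_period_s = 1.0 / (desktop_hz - game_hz)

print(f"drift per frame: {drift_per_frame_ms:.4f} ms")
print(f"clocks realign every {beat_period_s:.1f} s")
```

So the mismatch is invisible frame-to-frame but never goes away, which would fit the symptom of the two display modes fighting each other on every alt-tab.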
  8. This is actually not a problem that will be solved by throwing more money at it, be it in the form of SSDs or RAM, however copious, because the bottleneck lies elsewhere. For whatever reason, DirectX 11 in Win8 only allows certain games a 59.9Hz refresh rate where it would be 60Hz on Win7. I'm not sure if this is an Nvidia-specific or even HDMI-specific anomaly. All I know is that DX9 allows 60Hz on both OSes, and DX11 allows 60Hz on Win7 only. When the game is at 59.9Hz and the Windows system settings are at 60Hz, alt-tabbing takes several full seconds, whereas dropping the Windows system settings down to 59Hz makes alt-tabbing very quick. The results are the same with Vsync both on and off.
  9. Hey, I put Win 8 on my girlfriend's new computer. I've noticed that when playing full-screen games, alt-tabbing takes an unusually long time. There's no SSD in her machine, but it was definitely much faster in Win 7. Can you at some point report on whether this happens on your machine? Thanks in advance. :)
  10. I did this! I was sober at the time though, and it happened with an upsized McDonald's cup full of Coke. I actually used vodka to remove the stickiness over the next few days. On a related note, in my first year of university, I was tipsy one night in my dorm and happened to knock over a bottle of beer on my months-old laptop. The computer still worked, but I must have shorted something out in the built-in keyboard; some of the keys stopped working and I had to use a USB one for YEARS. It wasn't till late in the laptop's lifespan that I managed to replace the keyboard with one from a just-for-parts laptop off Craigslist.
  11. Slit your wrists? Please be my guest, really... But seriously, let's move on.

Unfounded conjecture, so let me clarify your doubts: The only thing we cannot guarantee if the user expands the loop is leaks, because we obviously can't control how the user will implement the setup (just like any custom loop). However, we have done everything in our power to make sure that the connections (clamps) are as "idiot proof" as possible. Videos and how-tos will be available upon product release showing exactly the proper installation/upgrade procedures.

Correct: if you put it together wrong, isn't it your fault by definition? However, our responsibility and commitment is to ensure that you don't, by providing a clear and complete installation guide and maintenance procedures, as mentioned above. Making light of this additional convenience to users is interesting. We'll see later below that you turn this opinion around 180 degrees to suit your purpose.

False: the space in the packaging is identical, but the cost of installation is higher (human resources), so it does cost us a little more to install them.

False: we've made the CPU upgrade method as simple as possible. It takes no more than 3 minutes to change from one socket to another.

False: no cost saving here, and it's not "introduced side channels", it's "enlarged the spacing of the side channels" so that there is no chance a novice user could puncture them by using non-standard screws. We've seen people bolting screws through the entire radiator...

False: Thanks for the technical advice, but that wouldn't eliminate the above scenario.

Self-contradicting: It is interesting that you dismissed installing 8 screws to attach the radiator as trivial above ("well those 8 screws really made me happy"), and you are now criticizing having to unscrew them to access the fill-port. What was so easy before is now becoming so difficult... looks like a contradiction to me, no?

Anyways, if you use this for CPU cooling only, there is no maintenance refill for 3 years, and if you do go custom, then you are by definition an enthusiast, and a little case modding shouldn't scare you: simple tools (a Dremel or a 1" hole saw) will allow you to cut a small hole directly above the fill-port for convenient repeated access (based on the premise that enthusiasts are frequently revising their configurations to add stuff).

Unfounded conjecture (?): How do you know that for a fact? Any statements so far? Any examples you care to give? Or are you just imagining? Maybe you heard it directly from the horse's mouth? If so, how? Let me ask you straight: are you representing one of these manufacturers, or are you associated with one of them? It is so easy to publish a rant like this under the protection of anonymity... If you do represent one of our competitors, then have the courage to say it.

Disgraceful, eh? How about this for disgraceful: I remember (from being there) that about 3 or 4 CESes ago, Cool-It (who manufactures the Corsair kit) demoed their system by comparing it to a first-release Apogee water-block that was at least 3 generations behind. Talk about an unequal comparison... They just wanted to prove that their AIO was better than custom kits, so they had to pick the most antiquated of our waterblocks. In contrast, we picked the latest models from the competition available at retail. Anyways, about comparative advertising in general: you must be equally offended by automakers comparing each other's models on a daily basis on TV, right? Let me add, in contrast to your statement, that every single member of the media who came to our booth clearly stated that it was the best demo they had seen BECAUSE it placed our kit in context with some of the competition. We used the latest kits available, and we never said anything derogatory. We just SHOWED the temps, using equally equipped PCs with equal settings, and had the public LISTEN to each one.

If any of the members of the media had any doubts or concerns, don't you think they would have voiced them right there and then?

From whose point of view? This was NOT an overclocking contest. It was meant to show the OC at the click of a button, like most novice users will do. It also guaranteed that the lowest possible voltage would be used to reduce the heat output of each CPU, so as to minimize the thermal differences due to CPU quality. This was certainly not meant, nor presented, as a lab experiment; it was presented to give users an idea of the respective ranking of the cooling systems. See more comments about this below.

False: we put all the fans at 1400 RPM, which also happens to be the minimum fan speed of both Corsair and Thermaltake.

False: the system using the GTX 680s only used the H220 radiator.

Not feasible: This was a trade show, not a lab experiment. I can't help it if people from the media didn't have a dB meter. Plus, there was so much ambient noise that it would have been impossible to measure anything scientifically. See, for example, the difference in ambient noise between Linus' video (using a directional mic) and the Tech of Tomorrow video, which also picked up all the room noise, making it difficult to hear the systems themselves.

Why? Because you say so? Let the readers decide what's best for them!

You didn't set anything straight: you voiced your personal opinions, more often than not using false or incorrect assumptions, and failed to present anything factually correct. Just a bunch of insinuations and conjectures. Taking temps at max speed would require the systems to ramp up in temp for 20 to 30 minutes. There was no time for that. And comparing temps at higher speeds to see how comparable they are? Comparing temps without comparing noise levels is utter nonsense. Running the Corsair unit at 2700 RPM is simply unbearable (as a matter of personal opinion). And how can you say that?

We DEMONSTRATED that it works with 2 GTX 680s and with 2 Radeon 7900s.

Mr. Grouchon, you sir have won my heart and mind.
  12. I've been doing some reading, and apparently Nvidia cards (namely the 660 Ti) have smoother video playback even though AMD cards (the 7950 in this case) have higher frame rates; it's a phenomenon caused by latency spikes. Although the 7950 produces more frames per second, sometimes one of those frames "sticks" a little longer than the others, so even though the overall FPS is higher, the video on the screen looks a little choppy. It's all outlined in this article by Tech Report, written on Dec 11. It's a follow-up to another comparison article they wrote in August:
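Here's a toy illustration of the idea (the frame times below are made up by me, not Tech Report's data): a card that averages more frames per second can still feel choppier if a few of its frames take far longer than the rest, because the average hides the spikes.

```python
# Toy example with made-up frame times (not Tech Report's data),
# showing how a higher average FPS can hide "stuck" frames.

def avg_fps(frame_times_ms):
    """Average FPS implied by a list of per-frame render times (ms)."""
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

# Card A: slightly slower on average, but perfectly even frame pacing.
card_a = [18.0] * 100                 # every frame takes 18 ms

# Card B: faster on average, but every 20th frame "sticks" for 60 ms.
card_b = ([14.0] * 19 + [60.0]) * 5   # 100 frames total

print(f"card A: {avg_fps(card_a):.1f} FPS, worst frame {max(card_a):.0f} ms")
print(f"card B: {avg_fps(card_b):.1f} FPS, worst frame {max(card_b):.0f} ms")
```

Card B wins the FPS comparison while its worst frame takes more than three times as long as any of Card A's, which is exactly the kind of thing frame-time percentile plots catch and plain FPS averages miss.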
  13. I'm surprised no one mentioned Daemon Tools yet. What software are you guys using to mount your ISOs?
  14. Look around for a used 5770. Kind of old, but it's a great card. Very reliable. I still use one. It's capable of maxing out Valve games, and I'm sure it'll be able to tackle Football Manager (no pun intended).
  15. Wow! I'm simply astonished. Haha, wish I had the moolah for a new rig.
  16. My processor is a Phenom II 550 Black Edition. The spec sheet says the max temp is 70C for that particular processor. The thing is... my CPU unlocks into a stable quad core, but when I do that, I'm inherently running a Phenom II 955. A Phenom II 955 has a max temp of 62C according to its spec sheet. I have no idea what's safe when I'm running with the two extra cores unlocked. Right now, my OC is small but stable. My cooling isn't anything special though (Corsair H50), and it sits right around 60C during a Linpack stress test. I'm kinda scared to push the temps any higher, as I'm not planning to upgrade my CPU until the next gen of processors comes out. No guts, no glory, I suppose.
  17. Ok, thanks for clearing that up. I had originally assumed max OC, but when the user M-ursu said on the previous page that he got his 5GHz running 24/7, my jaw just dropped. Then suddenly I thought everyone in the 5GHz club was running it 24/7, lol. I could understand one guy willing to push his CPU that hard, but not a whole thread.
  18. This might be a silly question, but are you guys running at 5GHz day to day? Are modern processors really that awesome now, or is that a max overclock? lol, even 4GHz as a day-to-day overclock would be a dream for me.
  19. I'd love to buy the pump/heat sink unit for a custom loop. The fan controller is pretty bitchin' too. But I'd prefer to use a different rad and fan set up.
  20. I haven't dual screened since XP, but when I did I used Ultramon. It had every feature I needed. That said, multi-monitor gaming support wasn't really on my list of needs.
  21. Like I said in the first post, the computer is a gaming rig and we'll be playing games other than Warcraft on it. I realize that other cards will suit that particular game, but the computer is being built to handle modern games in general. The reason I made the thread is that WoW's newest expansion has known issues on Win8, and I wanted them to be very minimal before I went the ATI route, as this isn't my own computer. Thanks for looking that up, xKronusx, I really do appreciate it. However, that information is outdated, as it describes the game's performance from the last expansion and their benchmarks were run on Win7. Anyway, I put the order in last night. We opted for a 670 in place of the 7970 and are going to get Assassin's Creed III with it for free. Thanks everyone for the feedback.
  22. I know the fault won't be with the video card. If anything, the fault will be with AMD's firmware. I already said there have been known issues that AMD themselves said they ironed out. http://support.amd.com/us/kbarticles/Pages/AMDCatalystSoftwareSuiteVersion1210ReleaseNotes.aspx My question pertained more to the synergy between the three (card, game, and OS). I've found reports of continued issues even after the firmware update. I thank you greatly for your suggestions, techfanic, and I already know what those cards are capable of. But what I'm looking for is more of a validation of a smooth gameplay experience with this setup rather than suggestions for video card alternatives. I already have the parts picked out and have done so for various reasons; if I don't get the 7970, I'm getting either a 670 or a 660 Ti. Basically, though, I'm wondering if anyone has first-hand experience with the setup I described.
  23. PowerColor 5770 PCS++ with a factory overclock. Not so flashy, but it runs WoW, LoL, Dota 2, and other Valve games.