
Everything posted by tmlhalo

  1. If it was a Steam key, try uninstalling Uplay entirely and then letting Steam install the game and Uplay. That has worked for me in the past.
  2. Yes, two 770s are more powerful than a Titan... IF the game supports SLI, which not every game does. Here are the games Nvidia chooses to showcase for SLI; if a game isn't listed there, its SLI support is mediocre, in which case you effectively have only a single 770 while the other sits idle. The single 970 will perform like a single 970 regardless of SLI support, since it is a single GPU (see the sketch below). http://www.geforce.com/games-applications/technology/sli
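     A minimal sketch of that point, with made-up relative performance numbers (the 1.0 / 1.5 / 1.6 baselines and the 0.9 SLI scaling factor below are illustrative assumptions, not benchmarks):

         # Rough model: with no SLI profile, the second card idles and you
         # get one GPU's worth of performance; with a profile, the second
         # card adds (scaling * per_gpu) on top.
         GTX_770 = 1.0   # assumed baseline
         GTX_970 = 1.5   # assumed relative single-GPU performance
         TITAN   = 1.6   # assumed relative single-GPU performance

         def sli_performance(per_gpu, count, has_profile, scaling=0.9):
             if not has_profile:
                 return per_gpu          # extra GPUs sit idle
             return per_gpu * (1 + scaling * (count - 1))

         print("770 SLI, profile:    ", sli_performance(GTX_770, 2, True))   # ~1.9, beats the Titan
         print("770 SLI, no profile: ", sli_performance(GTX_770, 2, False))  # 1.0, loses to it
         print("single 970, any game:", GTX_970)                             # 1.5 either way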
  3. Besides the fact that no one knows the specs, prices, or benchmarks of either, you aren't comparing 4-way SLI to 2-way Crossfire; you're comparing 4-way SLI to 4-way Crossfire. Just because two GPUs are on one PCB doesn't mean they aren't in Crossfire. The same rules apply: if there is no multi-GPU profile for the game, half of a dual-GPU card sits idle. Anything past 2-way SLI/Crossfire has minimal gains and minimal support.
  4. Desktop for gaming, but I do like my Chromebook for school. I probably won't ever buy a gaming laptop again, though.
  5. None of the Intel or AMD GPUs have HDMI 2.0 to my knowledge, and I don't know whether a DP-to-HDMI adapter would allow 4K at 60 Hz. My personal favorite TVs are the Sharp Quattrons; they have a yellow subpixel for better yellows.
  6. 0% if downsampling from 4K to 1080p (an exact 2x per-axis integer scale), 5-15% for any other resolution down to 1080p. See the quick check sketched below.
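     A minimal sketch of why 4K is the clean case (the resolutions are standard ones; the integer-scale check is my own illustration, not NVIDIA's actual DSR filter):

         # 3840x2160 is exactly 2x 1920x1080 on each axis, so every output
         # pixel is an exact 2x2 average; other factors need fractional
         # filtering, which costs sharpness.
         def scale_info(render, display):
             rw, rh = render
             dw, dh = display
             exact = rw % dw == 0 and rh % dh == 0 and rw // dw == rh // dh
             return exact, rw / dw

         for render in [(3840, 2160), (2560, 1440), (2880, 1620)]:
             exact, factor = scale_info(render, (1920, 1080))
             print(f"{render} -> 1080p: {factor:.2f}x per axis,",
                   "integer scale" if exact else "fractional scale")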
  7. Lower settings like the on-screen unit count or view distance.
  8. Event ID 41 is one of those useless debugging codes; you won't find anything relevant to fixing it by Googling it. It just means the machine didn't shut down properly, and you can get it on a machine with no issues simply by holding down the power button. Try @awesomes8wc3's post.
  9. It doesn't make it 4K. It makes a sharper 1080p image by downscaling from a larger render resolution: 1080p < DSR 4K-to-1080p < native 4K.
  10. Model numbers of both CPUs and the board?
  11. How about a different approach to figuring this out: what is your ambient room temperature? What case do you have, and what other case fans? Lastly, is the radiator's airflow configured correctly? I.e., if it is at the top of the case it should be exhaust; if it is in the front it should be intake. You can use a very thin piece of paper to see whether the airflow is pushing or pulling.
  12. Nvidia is pushing its company that way, which is why my next purchase most likely won't be an Nvidia product. I'm going to go ahead and use the most biased source I can, because I can: http://international.download.nvidia.com/webassets/en_US/shared/images/products/shared/lineup-full.png Notice anything about the bars when comparing flagship to flagship each gen (the X80s, not the dual-GPU solutions or the Titans)? Notice which gen had the smallest gap? Glad they focused on power consumption; it isn't like 4K screens are entering the affordable price range. Not exactly a time when I'm concerned about power consumption while trying to figure out the best way to make a resolution hop. I respect your opinion; I'm a bit salty from reading yet another AMD vs. Nvidia vs. Intel topic, so apologies if I come off blunt. Send me a PM if you want to post more grill pics; I'll find some tiny hybrid smart car pics to send as my opinion of Maxwell. Take it easy.
  13. I have a 690 (680 SLI). It has two 8-pin connectors, eats around 400 W under load, and almost hits the point of thermal throttling. It also heats my room in the winter and consumes more power than the R9 290X. I do not care about power consumption.
  14. Power efficiency doesn't put frames on the screen; I couldn't give a damn about power consumption. If I wanted something to play games at less than 200 W I'd get a laptop. The joke here is comparing last-gen AMD to current-gen Nvidia, though sadly the joke isn't very funny. I don't see you being around on these forums long.
  15. I would say yes: much larger surface area to dissipate heat, more heat pipes, and larger fans. There isn't a lot of information to go on, though. The first video posted would give you the best idea of the possible difference.
  16. This is my best summary of the R9 295X2 compared to a Titan X. The R9 295X2 isn't AMD's top-end card; it is just two R9 290X cards on one PCB.

      How many R9 295X2s can be in Crossfire? 2. How many Titan Xs can be in SLI? 4. How well does a single R9 295X2 perform under the worst-case scenario for Crossfire support? 50%, or equal to an R9 290X. How well does a single Titan X perform under the worst-case scenario for SLI support? 100%, or equal to a Titan X (see the sketch below). The fact that the two R9 290X GPUs are on one PCB means nothing; they function the same as, or worse than, two R9 290X cards.

      Other examples of how wrong it is to compare R9 290X Crossfire to a Titan X: "Truck A hauls 800 lbs. Truck B hauls 1,200 lbs. Two of truck A can haul 1,600 lbs, therefore truck A is better than truck B." "Basketball team A gets beat when up against team B and team C at the same time!" "Gun A takes 1 shot; gun B takes 2 shots. But I'll just dual-wield two of gun B!" So what happens when the other guy dual-wields gun A?

      I have a GTX 690 (680 SLI), and I almost always type "(680 SLI)" when posting about it, because putting two GPUs on one PCB doesn't change that fact. If you ever actually buy a dual-GPU card, you come to the realization very quickly that you just have SLI/Crossfire once you see the glory of GPU monitoring software showing you GPU usage: it isn't one bar for usage, it is two; it isn't one bar for temperature, it is two; it isn't one fan speed graph, it is two. It will also act like two cards when you watch one of them idle because the game doesn't support SLI/Crossfire.

      I hate the Titan lineup; they have always been overpriced for their performance. At least the original Titans were a bastard card that could also be used for some workstation tasks thanks to their double precision. The Maxwell architecture doesn't do double precision well enough to be considered a bridge between GeForce and Quadro, so the Titan X is now nothing more than an expensive single-GPU gaming card from Nvidia. For all its flaws, though, comparing it against a dual-GPU setup is still more wrong.
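      A minimal sketch of that worst-case argument (illustrative, not benchmark data):

          # Without a Crossfire/SLI profile, only one GPU on a card does any
          # work, so a dual-GPU card delivers 1/N of its on-paper performance
          # while a single-GPU card always delivers 100%.
          def worst_case_fraction(gpus_on_card):
              return 1.0 / gpus_on_card

          for name, gpus in [("Titan X (single GPU)", 1), ("R9 295X2 (two R9 290X)", 2)]:
              print(f"{name}: worst case = {worst_case_fraction(gpus):.0%} of on-paper performance")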
  17. The technical answer is yes; the practical answer is no. You cannot easily run an AMD GPU with a Nvidia PhysX card, and even if you do get it working, it only works in 4 games. The mod lost support 6 years ago. Here is the information on how to use hybrid PhysX: http://physxinfo.com/wiki/Hybrid_PhysX
  18. Well, if you can support the card so the PCB doesn't flex under the weight, and if you can jury-rig the fan to blow on the memory and VRMs, then you can do it. The internal fan slot in the 530 isn't close enough, and all 3 GPUs I've opened up have had thermal pads attaching the memory to the main heatsink.
  19. If it is a new PC, some Windows updates need others installed first before they can install, so there are often multiple cycles you have to go through to get all of them out of the way. The rainbows are probably unrelated, but you can try reinstalling the GPU drivers once Windows is done.
  20. If you have to buy a PhysX card, then don't; just save up and do SLI, as the performance increase will be much better. If you have an old card lying around you can try it, but it often isn't worth it. I ran a couple of tests using a 680 (half of my 690 disabled), a 690 (680 SLI), a 680 + GTX 260, and a 690 + 260. While the 260 did slightly improve the frame rate, it isn't worth it unless you already have the old card, and that old card isn't horrendously loud.
  21. Yup, photo of my GTX 260 sitting disassembled behind me.
  22. http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67241-nzxt-kraken-x61-x41-liquid-cpu-coolers-review-7.html
  23. Nothing related to my post. Mounting the Evo that way would have the air blowing parallel to the PCB, not cooling the other components on the PCB, like the memory or VRMs, the way other solutions do. That is why things like the G10 sport a fan on the rear to blow at those parts of the PCB.