majsta

Member
  • Posts

    45
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Not Telling

System

  • CPU
    Core i7 4770K
  • Motherboard
    ASUS Maximus Hero VI
  • RAM
    Kingston HyperX Beast 32GB@2400MHz
  • GPU
    Nvidia GTX770
  • Case
    Fractal Design Define R4 Windowed
  • Storage
    Intel 530 SSD + 1TB WD Green + 2TB Seagate Barracuda
  • PSU
    Thermaltake SE 730W Bronze 80+
  • Display(s)
    ASUS MX279h
  • Cooling
    Stock Intel cooler
  • Keyboard
    Logitech K120
  • Mouse
    A4Tech x7 Oscar
  • Operating System
    Windows 8.1

Recent Profile Visitors

984 profile views

majsta's Achievements

  1. As far as I can see, in 90% of the titles out there 2 vs 4 GB doesn't mean a thing. The difference you mention may show up as occasional frame drops, so I would expect it to affect minimum values more than the average.
  2. I just wanted to give you an update on the laptop. The repair shop couldn't do anything, as expected. They hit the same problem I did, the person looking into it was just as frustrated as me, and after 4 days nothing had happened. They are a great repair shop, they just couldn't do anything, so they returned the laptop and didn't charge me a thing. So yeah, it still behaves the same, still not working. PS: I will try replacing the CMOS battery, but this is a desperate attempt.
  3. Sorry for the late response, I wasn't on the forum for a couple of days. I agree with him 100% in this case, and with you for that matter. In BF1, 2GB is not enough at 1080p Ultra, case in point here. But still, when switching to 1080p on High settings the drop is smaller than on Ultra, and I just want to remind you of my previous statement: the difference between High and Ultra at 1080p is barely noticeable, if visible at all. And I always come back to the power of the GPU. The GTX 770 has higher bandwidth, so it will output more FPS for sure; two of them will go over 60 FPS with ease, maybe even on Ultra. The only problem I might experience is occasional frame dips (as explained in some of the previous videos, but I wouldn't know until I test it), which could ruin the experience, but without the test we can't say for sure.
  4. Never heard of that BSOD to be honest, but it sounds like you have driver problems. If a driver is not working, it usually shows up in Device Manager with an exclamation mark. It might not, but it's still worth checking. He needs to clean his PC at least twice a year IMO. It's not that hard, a can of compressed air and an hour of work; tell him not to be so lazy or he will lose his PC. And speaking of PSU stability: I had a power surge from the wall outlet last night. I swear my heart skipped a beat, I thought the PC was a goner. But the ASUS Maximus Hero VI's power protection kicked in; I thank God I didn't pick a cheaper board. If not for overclocking, it sure paid off now. No news on the laptop still. They asked for the password, so I guess they are working on it, but the problems start when they log into Windows: they are in for a party of BSODs all around. Yes, this thread has gone in a lot of different directions, with some heated discussions, myself included in most of them. I like talking to you guys, there's a lot of useful info to be heard.
  5. It's OK, we are having a meaningful discussion, we just can't agree on what we are discussing. (That's a joke, everyone chill.) It's not personal preference: I like AMD more than Nvidia, but at the moment I bought the GTX 770, the price-wise comparison was the R9 280X, and the GTX 770 was better at that time. So as much as I like AMD, I went Nvidia that time, simple.
  6. You are missing the point once again. I am not arguing about who has the bigger... card. Yes, Nvidia guys are dicks for pushing SLI, I agree. AMD did away with the CrossFire bridge/cable a long time ago, and the technology is similar, with multi-GPU support on both sides; Nvidia could adopt the idea, but they won't. I am sure you know how important it is to push your own ideas and be the leader in IT, and both you and I know they will never abandon SLI. We can go on about which one is better, but that is not the point here. My point, from the beginning, has always been that I can get a 50% performance benefit with SLI, and I know you will agree with me on this.
Now, let's get back to the basics of the problem. I am running a 1080p 60Hz monitor. I am NOT GOING TO UPGRADE TO 1440p any time soon, maybe never, as I find this just enough for my daily use, and at the moment I don't care about the power draw and/or temps because the cards run just fine in my setup. So please don't pull up 1440p gaming charts for these cards. I know you are making a point about better hardware there, and it is a valid comparison, but remember that the R7 200 and R9 200 series are refreshes, the same way the GTX 700 series is of the GTX 600 models. And yes, AMD cards tend to get better as time passes, but that is just due to poor driver support at release in the past (I hear they have improved a lot in the last year). These cards come into their own after a few driver iterations, but at release they suck. I had an AMD card before this system; don't get me started on the drivers for Windows, let alone the drivers for Ubuntu. Getting a bit off topic here.
To run games at 1080p 60 FPS, two-way SLI on the GTX 770 2GB OC MSI is more than enough. That is my opinion, and I don't see any valid proof otherwise, except for a few games we could argue aren't valid examples. (In Hitman all Nvidia cards run like crap compared to AMD, and Batman: <insert any title here> is just plain shit when it comes to implementation; I think we can agree on that for both AMD and Nvidia cards.) As I said about DOOM and Tomb Raider: if I can't play them at max settings + AA, I will drop the AA and/or bring down the graphics slider, but I think I can. Without proof I can't say for sure, though.
Working in IT sure relates to this topic. Most people tend to blabber (I am NOT saying you are one of them) about things they hear or read online without really getting into the core of the problem or the technology. They don't do research on the topic, they don't have a good understanding of how a CPU/GPU or anything else works, they don't follow new technologies, and most important of all, they have never even tried to code something, not even "Hello World". Not to mention they don't know what polygons, textures, shaders, or texels are and how a GPU uses them. I was telling you this to address your statement that devs don't have to work on SLI/CrossFire support. They do. We do: from my own experience, in our software we support DX9, DX10 and DX11 (I know it is not the same thing as SLI, but bear with me). You have to have support on both sides if you want your product to be the best and to work well on any crap hardware (GTX 770), not just driver support on the manufacturer's side (I didn't forget the Nvidia and AMD devs, I just figured that's a given).
So, as one IT guy to another, can we agree that I can get 50% more performance from SLI, and that I can "tweak" the games to hit 1080p 60 FPS with VSync on? This has gone way off the initial topic. I should have asked: is GTX 770 SLI enough for 1080p 60 FPS on High/Max settings?
  7. How can you just say no? Did you even look at the article? Different games, different scaling; the hardware is the same and only the developer is different, so the logical explanation is that it depends on the game and the game developer. And how do you explain the benchmarks? Heaven and Valley I tested myself: near-perfect scaling, and in Valley I got more than double, and those benchmarks are NOT doing anything to "optimize for Nvidia" as you argue. So there is nothing wrong with Nvidia's SLI driver support for older cards. Dude, I get driver updates every 7 days for new titles. Or do you think Nvidia just stopped making drivers for older cards after the release of the 900 series? Because they didn't.
As for your other argument: so you want to tell me that devs don't have to do anything to get CrossFire to work? You think they just flip a switch and it works out of the box? No, man. Both engine developers and game developers have to work really hard to utilize that performance, and they DO, because they want their engine/game to perform the best on the worst hardware possible, because then they sell more and earn more. Or do you simply like AMD more than Nvidia and want to say they make better cards? Because they don't; for the last 5 years they really don't. None of their cards is better tier for tier, they simply aren't. Will it please you if I run XCOM 2, DOOM and Witcher 3 on a single card and in SLI to see how it scales, if you don't believe the guys from Guru3D and what they wrote in the article? Or we can just look at GTA 5 in this video, see the side-by-side comparison of the GTX 770 2GB MSI versions I have, and after that agree that we get 50% scaling.
PS: I hate to bring this up, dude, but I have a Master's degree in Electrical and Computer Science and more than 5 years of experience in software development, so computers, hardware and software, APIs, and even game dev in Unity are kinda my thing, and I would argue I know a thing or two about "devs spend more time optimizing" things. Trust me on this.
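The scaling argument is really just arithmetic on benchmark numbers; here's a minimal sketch of it (the FPS figures below are hypothetical, for illustration only, not taken from the Guru3D article):

```python
def sli_scaling(single_fps, sli_fps):
    """Fractional gain the second card adds on top of one card.

    A result of 0.5 means SLI delivered 50% more FPS than a single card.
    """
    return sli_fps / single_fps - 1

# Hypothetical numbers: one GTX 770 averaging 45 FPS,
# two in SLI averaging 68 FPS.
print(round(sli_scaling(45, 68), 2))  # 0.51 -> roughly +50%
```

Running it over each game's single-card and SLI averages is exactly how "it depends on the game" shows up: the factor swings from near 0 (no SLI profile) to near 1 (near-perfect scaling).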
  8. OK, I watched the SLI and CrossFire video comparison. First of all, support for SLI and/or CrossFire has to come from the game developer for the game to use it. So if one brand or the other is showing more or less performance, I would argue that the game's support for that brand is better. And some games don't support either. I think we can all agree on that. Now for the important thing: the GTX 980 Ti is a different architecture than the GTX 770, so if the first scales 50% it doesn't mean the second one will too. There is an undeniable difference here, an architectural difference if I may say so. To see how the 770 scales, we would have to run it in single, dual and triple SLI, and as it happens there is an article about this: http://www.guru3d.com/articles_pages/geforce_gtx_770_sli_review,1.html (to be fair, some older games are featured, but they were AAA titles 2 years ago, so give me some slack). It comes to the same conclusion I am arguing here: at 1080p 60 FPS it's just fine, and SLI scaling depends on the game.
  9. I didn't say it doesn't support it, I said this is a DX11 card. Support is one thing: a lot of cards support DX12, it is an API, but this card is built around DX11 and OpenGL 4.x, and all of its drivers are DX11 and OpenGL based. The Kepler architecture can run DX12 games, as well as Vulkan games, but it cannot utilize all of their features to the fullest. It does not have async compute (well, none of the Nvidia cards do, nothing similar to the AMD ones), so all those parallelism benefits and low-level hardware control really don't do much for the 770. I would argue this GTX 770 would do worse in DX12 than in DX11 in Hitman. I'd like to see that test, to be honest, but I don't have the game.
  10. The same thing happened to me when I went from Win 8.1 to Win 10. It works for a month and then starts BSODing on random stuff. First I thought it was faulty hardware, but all tests came back OK. Then I dropped my CPU clock 100MHz and it was fine. Then a new SSD came along, I reinstalled, and got my +100MHz back (really, it was not the SSD; the clean Windows install and fresh drivers probably got it working). No problems since, knock on wood. Your PSU should be fine then; those are both great, no problem there, I am sure of it. I don't know the conditions his PC is in, what case he is using, or where he keeps it, but I can give you some useful advice. Positive pressure inside the case is always good, but make sure you have some filters on the intake. If the case has no filters, a good disposable filter is old women's stockings, the nice ones women wear, which you can duct-tape on to ghetto-mod the case. Especially if the intake is on the bottom of the case, you won't see them and they will keep the dust away. Just make sure you change them once in a while. Also, if the intake is at the bottom, don't keep the PC on the carpet; that is a hazard waiting to happen, especially if the PSU sits at the bottom and intakes from below. A nice piece of wooden floor is all you need. And tell him to use a vacuum cleaner around the PC once in a while; it is nothing to be embarrassed about if he wants to keep his stuff nice, clean, dust-free and working longer. As for the laptop, I'll post here when I get some info from the repair shop. I hope they fix it though, she wants to use my PC and we can't have that now, can we?
  11. Well, the used market here is expensive, so 100 euros for a GTX 770 is a deal from a friend; they usually go for 140 to 160, and they sell out in 2-3 days. It's just like that around here. You can get one on the open market in my country, but prices are nowhere near those in the USA, Germany or some other EU countries. I've seen these cards go for 80 euros on eBay in the EU, so go figure.
  12. Exactly my point in the comments. I am on 1080p and my monitor is a 27" ASUS with a 60Hz refresh rate, so 1080p 60 FPS is optimal for me. One card is pulling its weight great in most titles I've played. I had some problems with Witcher 3, for example, on one card: 60 FPS no problem with almost everything on max, but HairWorks kills the performance. I don't know if it's the VRAM or the GPU being underpowered, but it made me think about an upgrade for the first time, hence this thread. Other games, no problems to be honest, but hey, for 100 euros I think this is the best deal at the moment.
  13. OK, I watched the 3GB vs 6GB GTX 1060 video. I agree with everything said, but you didn't read what I am writing to you. At 1080p, a 1-2 FPS drop I just don't care about as long as the games stay above 60 FPS, and they do, and the majority of games run great at 1080p. So if the card drops frames at 1440p, as shown with Assassin's Creed in this video, I don't care, because I am on 1080p and the game will run great there; maybe a 1-2 FPS drop, but overall over 60 FPS, so I can live with that (you have game performance in the video I posted). The Hitman test is on DX12 and my card is a DX11 card, so that test isn't valid for me, sorry; I would have to see a DX11 performance comparison. And even so, if I can get over 60 FPS, and I believe I can with my setup, it doesn't matter. The new Tomb Raider and DOOM may not run on Ultra with max AA, but I will drop the AA settings and still have the same experience at 1080p 60 FPS. My point being, you can't see the difference unless you are looking for it; your eyes just don't see the difference from High to Ultra. Try it yourself, you have the superior hardware and can compare both Ultra and High. Stutter may be an issue, but as far as I can see in both videos (the VRAM comparison ones), there is no significant performance drop and no stutter at 1080p. I'll go watch the SLI one now.
  14. Hey man, I am on Windows 10 now, I got the Anniversary Update and what not, so latest and greatest spy software from Microsoft here (tinfoil hat on). Jokes aside, one thing to note with OC: it varies from game to game, you have to be careful. Many games don't run well on OC'd cards. I know it sounds crazy, but I found out the hard way. In my experience, any CryEngine game may not work nicely with OC'd cards. I have been playing Armored Warfare, which uses CryEngine. I tried playing on my GTX 770 with a slight OC (the card was stable for sure; I ran benchmarks and stability tests beforehand to figure out how far I could OC) and it crashed the game every time. Later I found out that CryEngine doesn't work well with OC'd cards (there's a Reddit post somewhere, you can find it easily), and people have gone as far as under-clocking models that are OC'd out of the box by the manufacturer just to play the game. That's crazy IMO, but it's a thing apparently. Though the games crash, I didn't get any BSODs even when the card was OC'd (since I upgraded to two cards, I don't run them OC'd). But I did a fresh Windows 10 install one month ago when I upgraded my SSD, so it's a new Windows, not an upgrade. Maybe you should try a fresh install if you have the media. If you upgraded from Windows 7 and don't have a Windows 10 installer, I don't know what you can do.
The other thing to consider is the power supply. My GTX 770 2GB uses, I believe, 22 amps on the +12V rail (don't quote me on that), so for SLI you would need AT LEAST 44 amps on +12V, plus anything else that is on the same rail. So wattage isn't the only thing to consider on a power supply; you have to watch the amps as well. I have a 730W 80+ Bronze with 55 amps on the +12V rail and it is just barely enough, but it is stable. I hope you get it up and running.
I feel your pain, man, trust me, as I spent last weekend trying to fix my GF's laptop, and it just ruined it for me. I love doing this stuff, but sometimes... I tried Win 10, Win 8.1, Win 7 x64, Win 7 x86, and just no: they all BSOD after a clean install, every time with a different message. The RAM is working (memtested), the hard drive shows 100% health in Hard Disk Sentinel, so none of the removable hardware is the problem. So I took the laptop apart, cleaned out the dust, removed and reseated the CMOS battery, replaced the thermal paste with Noctua paste (no shitty thermal paste for my GF), and still nothing. And the crappy laptop can't even do a BIOS update, god it is annoying. This is like Linus's spooky hardware: BSODs for no apparent reason. In the end, after 3 days of troubleshooting the crap out of everything, I just gave up and sent it to a computer repair shop. To be honest I don't think they will fix it; if they do, congrats to them.
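The +12V rail check above is easy to do on paper; here's a small sketch, assuming the rough amp figures from the post (the 8 A for the CPU and the rest of the system is my guess, not a measured value):

```python
def rail_headroom(rail_amps, card_amps, num_cards, other_amps=0.0):
    """Spare amps left on the +12V rail after the GPUs and everything
    else sharing the rail are accounted for (negative = overloaded)."""
    return rail_amps - (card_amps * num_cards + other_amps)

# 55 A rail, ~22 A per GTX 770 in SLI, ~8 A guessed for the rest:
print(rail_headroom(55, 22, 2, other_amps=8))  # 3 A -> "just barely enough"
```

That thin 3 A margin under the assumed load is why wattage alone is misleading: a 730 W unit with a weak +12V rail could still sag under two cards.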
  15. Look man, I don't want to argue with you. I know that I can't play all of the AAA games on Ultra with max AA above 60 FPS with HairWorks turned on or whatnot, but the fact is that 90% of games handle just fine on 2GB models, and you don't really notice the difference between Ultra and Very High textures in most games. And if I can't run Rise of the Tomb Raider on Ultra, then I'll just drop back to High or turn down the AA or something to get to those 60 FPS without stutter. I am fine with that, and you should be too. My point is that VRAM doesn't affect performance as much as you think. If you watched the video, you may have noticed, for example, that:
- Assassin's Creed: Syndicate allocates 2.7 GB and drops 1 FPS on the 2GB card,
- Middle-earth: Shadow of Mordor allocates 2.6 GB on Very High and drops 1 FPS on the 2GB card; on Ultra it allocates 3.9 GB and drops about 3 FPS on the 2GB card,
- Star Wars Battlefront allocates 2.9 GB and the 2GB card pulls ahead, scoring 2 FPS more than the 4GB,
- Far Cry Primal allocates 3.9 GB and there is a 1 FPS drop,
- Batman: Arkham Knight allocates 3.4 GB and drops 2 FPS on the 2GB card.
The only game in all of the tests that benefits from more VRAM is DOOM, which allocates 3.1 GB and gets almost double the performance on the 4GB card. If the amount of VRAM were the issue, it would show up in all of the games, but it didn't. A 1-2 FPS drop is marginal, and in some cases, like Star Wars, it is just funny how little the VRAM matters. The main factor in FPS isn't the data allocated, it's the power of the GPU, the GPU speed and the VRAM speed. Texture sizes are the same no matter what GPU you are rocking. In more games than not, the video showed that it doesn't matter (unless you think 1-2 FPS matters, in which case we should end the conversation now, because we just can't agree on this one).
And remember that the GTX 770 has more bandwidth than the RX 460 and an overall better GPU, with double the pixel rate, triple the texel rate, double the bandwidth, and more than 50% higher compute performance, so the average and minimum frames should be at least 50% better with a single card, and with SLI I expect at least double what is shown in the video (and I took your pessimistic "SLI doesn't scale in games" into account here). Hey, even if a game doesn't support SLI, the second card can be used as a PhysX processor, so more power that way. I am more than happy with these performances, as I feel a 1-2 FPS drop across the board is nothing. Except for DOOM, but if an RX 460 can run the game at 50 FPS average, I won't have any problems getting up to 60 FPS average. I'll test DOOM at max settings and get back to you if you want. Sorry for the long reply, I really do respect your opinion, but I wanted to explain my point of view.
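Whether a 1-2 FPS drop matters is easier to judge as a percentage; a quick sketch over the kind of figures quoted above (the 60/58 pair is an illustrative example, not from the video):

```python
def drop_pct(fps_big, fps_small):
    """Relative FPS loss of the smaller-VRAM card, in percent."""
    return (fps_big - fps_small) / fps_big * 100

# A 2 FPS drop at around 60 FPS is only ~3%:
print(round(drop_pct(60, 58), 1))  # 3.3
```

Against DOOM's near-halving (roughly a 50% loss by the same formula), a 3% delta across the other titles is the margin-of-error territory the post is describing.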