
SLI or save up for new GPU

A GTX 980 Ti might be cheaper in your area; it has very similar performance to a 1070, so that might be a good option.

System

  • CPU
    I7 6700K Overclocked to 4.6 GHz at 1.33v
  • Motherboard
    ASUS PRIME Z270-A
  • RAM
    GSKILL RIPJAWS V DDR4 16GB 3000MHZ
  • GPU
    MSI GTX 1070 GAMING X 8G overclocked to 2063 MHz core and 8900 MHz memory
  • Case
    NZXT S340 RED
  • Storage
    WD 1TB BLUE AND SAMSUNG EVO 250GB SSD
  • PSU
    EVGA 650W GQ
  • Display(s)
    LG 25UM58-P ULTRAWIDE and LG 29UM58-P 29 ULTRAWIDE
  • Cooling
    CORSAIR H100I GTX
  • Keyboard
    CORSAIR K70
  • Mouse
    LOGITECH G502 PROTEUS SPECTRUM
  • Operating System
    Windows 10 PRO

12 hours ago, Morgan MLGman said:

This isn't my point. To run the game at 1080p at settings that really push 4GB of VRAM to its limits, you need a fast GPU. My R9 290X can basically max out any game at 1080p/60FPS, but barely. Now imagine that a 460 is something like 130% slower than my 290X...

But that's just it. At the 1080p/60FPS with v-sync that I'm running, there isn't a game that fills up the 4GB of VRAM, even with maxed-out settings. They barely fill 2GB, and even when they do fill up to 2GB you don't get the FPS drop; that is why I posted the video, and that is the point of the video. And my GTX 770 is a way better card than the RX 460; even though it is older, it is still just simply better.


9 hours ago, laushik said:

A GTX 980 Ti might be cheaper in your area; it has very similar performance to a 1070, so that might be a good option.

Already talked about it. People in my area don't sell their 980 Tis to upgrade to 1080s or Titan XPs; they just don't have the money. Simple as that. And even if somebody does, the price of a used 980 Ti is 300+ euros, sometimes even more. (And the average paycheck in my country is ~360 euros, so yeah, people would have to work for a month and not eat to afford that card, which is just sad :()


20 minutes ago, majsta said:

-snip-

Almost every game of mine pushes over 2GB of VRAM; some push almost the full 4GB, and some require more than 4GB.

 

The GTX 770 isn't a fast card, tbh; GTX 770 SLI is pretty much effectively as fast as my single 290X in games. (The 290X is around 50% faster than a single 770, and SLI scaling is just around 50%, so that's an accurate assumption.)
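That equivalence can be sanity-checked with a quick back-of-envelope script; the 1.5x and 50% figures below are the post's own estimates (not benchmarks), normalized so a single GTX 770 = 1.0:

```python
# Normalize everything to a single GTX 770 = 1.0 (illustrative units only).
gtx_770 = 1.0
r9_290x = gtx_770 * 1.5          # "290X is around 50% faster" than one 770
sli_scaling = 0.5                # "SLI scaling is just around 50%"
gtx_770_sli = gtx_770 * (1 + sli_scaling)

# Both come out at 1.5: under these assumptions, 770 SLI ~ one 290X.
print(gtx_770_sli, r9_290x)
```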

 

2GB is not enough to max out games at 1080p. That's a fact that everyone knows about. You know... Just because a game says it uses the full 2GB of your VRAM doesn't mean it really does. What MSI Afterburner and the likes of it show you is ADDRESSED VRAM, not actually used VRAM. You'd need a game that says it wants around 3GB to actually feel stuttering, and to say it up front: there are plenty of games like that. The card can hold 2GB of data in its memory; after that it has to start switching data out of the VRAM and pulling data out of system memory, which is MUCH, MUCH slower and will impact your performance to a very noticeable degree. Games now, even though they claim they use 2GB, only ADDRESS that much. Run Rise of the Tomb Raider with the Ultra texture setting (which claims to take around 5GB+ of VRAM at 1080p) and tell me about no stuttering again.
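The "MUCH, MUCH slower" part is easy to quantify roughly. The GTX 770's GDDR5 is rated around 224 GB/s, while anything spilled to system RAM has to come back over PCIe 3.0 x16 at roughly 16 GB/s one way (published ballpark figures; real-world throughput is lower still):

```python
# Ballpark bandwidth figures in GB/s; treat both as approximate.
vram_bandwidth = 224.3   # GTX 770 GDDR5, per spec sheet
pcie3_x16 = 15.75        # PCIe 3.0 x16 theoretical one-way throughput

# Spilled textures get fetched over PCIe, an order of magnitude slower.
print(f"VRAM is ~{vram_bandwidth / pcie3_x16:.0f}x faster than the PCIe path")
```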

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


10 hours ago, System Error Message said:

You should upgrade. 2GB of VRAM is your limiting factor.

 

If you ever plan to go with 2 GPUs in the future, then you will need a GPU with double the VRAM. That helps performance scale in the future as requirements increase, if you ever plan on dual GPU. Don't look at dual-GPU support as a limiting factor: with two of the same GPU there's driver support, as opposed to per-app support. Some games may not scale, but most games do.

I already went for two GTX 770s; it's done now. If I upgrade, it will be in like a year from now, when prices drop to a reasonable level. I know what you all mean about being future-proof, and the fear of 2GB of VRAM not being enough for future games, but I can't give 550+ euros for a GTX 1070 now, I just can't. Sorry if anyone is disappointed by my decision, but this seems logical to me. And as you said, games tend to support SLI, apps don't, and the only app I am using is Adobe Premiere Pro, and they are pretty good with their support.


1 hour ago, Morgan MLGman said:

-snip-

Look man, I don't want to argue with you. I know that I can't play all of the AAA games on ultra with max AA above 60 FPS and HairWorks turned on or whatnot, but the fact is that 90% of games handle just fine on 2GB models, and you don't really notice the difference between ultra and very high textures in most games. And if I can't run Rise of the Tomb Raider on Ultra, then I'll just drop back to high or turn down the AA or something to get to those 60FPS without stutter. I am fine with that, and you should be too.

 

My point is that VRAM doesn't affect performance as much as you think. If you watched the video, you may have noticed, for example, that:

- Assassin's Creed: Syndicate allocates 2.7 GB and drops 1 FPS on the 2GB card,

- Middle-earth: Shadow of Mordor allocates 2.6 GB on Very High and drops 1 frame on the 2GB card; on Ultra it allocates 3.9 GB and drops about 3 FPS on the 2GB card,

- Star Wars Battlefront allocates 2.9 GB and the 2GB card pulls ahead, scoring 2 more FPS than the 4GB one,

- Far Cry: Primal allocates 3.9 GB and there is a 1-frame drop,

- Batman: Arkham Knight allocates 3.4 GB and drops 2 FPS on the 2GB card.
The only game that benefits from more VRAM in all of the tests is DOOM, which allocates 3.1 GB and gets almost double the performance on the 4GB card. If the amount of VRAM were an issue, it would show up in all of the games, but it didn't. A 1-2 FPS drop is marginal, and in some cases, such as Star Wars, it's just funny how little the VRAM matters.

The main factor in FPS isn't the data allocated; it's the power of the GPU, the GPU speed, and the VRAM speed. Texture sizes are the same no matter what GPU you are rocking. And in more games than not, the video showed that it doesn't matter (unless you think 1-2 FPS matters, in which case we should end the conversation now, because we just can't agree on this one).
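To put those drops in perspective, here is the same data as percentages. The 60 FPS baseline is my assumption for illustration; the video's per-game averages vary:

```python
# FPS deltas from the games listed above, relative to an assumed 60 FPS target.
drops = {"AC: Syndicate": 1, "Shadow of Mordor (Ultra)": 3,
         "Far Cry: Primal": 1, "Arkham Knight": 2}
baseline = 60  # assumed target frame rate, not from the video
for game, drop in drops.items():
    print(f"{game}: -{drop} FPS = {drop / baseline:.1%}")
```

Even the worst case in the list (3 FPS) is a 5% change at that baseline.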

 

And remember that the GTX 770 has more bandwidth than the RX 460 and is overall the better GPU, with double the pixel rate, triple the texel rate, double the bandwidth, and roughly half again the compute performance, so the average and minimum frames should be at least 50% better with a single card, and with SLI I expect at least double (and I factored in your pessimistic "SLI doesn't scale in games" here) what is shown in the video. Hey, even if the game doesn't support SLI, the second card will be used as a PhysX engine, so more power that way.
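For what it's worth, approximate spec-sheet figures (my numbers from public spec databases, treat as ballpark) put the theoretical gaps at roughly 1.5-2x depending on the metric; real-game differences are usually smaller than theoretical rates:

```python
# Approximate spec-sheet figures: (GTX 770, RX 460). Ballpark values only.
specs = {
    "pixel rate (GPixel/s)": (33.5, 19.2),
    "texel rate (GTexel/s)": (134.0, 67.2),
    "bandwidth (GB/s)":      (224.3, 112.0),
    "FP32 compute (TFLOPS)": (3.2, 2.2),
}
for name, (gtx770, rx460) in specs.items():
    print(f"{name}: {gtx770 / rx460:.1f}x in the 770's favour")
```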

And I am more than happy with that performance, as I feel a 1-2 FPS drop across the board is nothing. Except for DOOM, but if an RX 460 can run the game at 50 FPS avg, I won't have any problems getting up to 60 FPS avg.
I'll test DOOM at max settings and get back to you if you want.

 

Sorry for the long reply; I really do respect your opinion, but I wanted to explain my point of view to you.


18 hours ago, majsta said:

-snip-

Great to hear it's stable for you. You run Windows 8.1, right?

 

Yeah, my friend's PC build with 770 4GBs in SLI has been driving me crazy. It randomly crashes in some games, or bluescreens. What drives me nuts is that I turned all of the OCs back to stock speeds and ran Intel Burn Test, AIDA64, and FurMark for half a day, and the system handled it with no issues.

 

I can't tell if it's a Windows 10 issue or not, as his PC was rock solid on Windows 7. I feel bad for my friend because his build cost a lot back in the day ($400 per 770... this was back when the 770s were new). I've also had issues with the most recent Windows 10 update (the same random system lockups, and my network drive refuses to reconnect on startup...). I only play a handful of RPG games myself; my friend, though, plays a lot of games, from AAA titles to Rocket League.

 

Happy to hear you got a working system for a good deal, though. I got tired of my old FirePro V7900 (it struggled at 14-24 FPS in the above RPG games, which were PS3/PS4 ports), went overkill, and got a used 980 Ti for $350 (the 1070 at the time was crazy overpriced... like $450+, because they were out of stock).


2 minutes ago, scottyseng said:

-snip-

Hey man,

I am on Windows 10 now; I got the Anniversary Update and whatnot, so the latest and greatest spy software from Microsoft here (tinfoil hat on).
Jokes aside, one thing to note with OC: it varies from game to game, so you have to be careful. Many games don't run well on OC'd cards. I know it sounds crazy, but I found out the hard way.

In my experience, any CryEngine game may not play nicely with OC'd cards. I have been playing Armored Warfare, which uses CryEngine; I tried playing on my GTX 770 with a slight OC (the card was stable for sure, having run benchmarks and stability tests beforehand to figure out how far I could OC), and it crashed the game every time. Later I found out that CryEngine doesn't work well with OC'd cards (there's a Reddit post somewhere, you can find it easily), and people have gone as far as underclocking models that come OC'd out of the box just to play the game. That's just crazy IMO, but it's a thing apparently.

Though the games crashed, I didn't have any problems with BSODs even when the card was OC'd (since I upgraded to two, I don't run them OC'd). But I did a fresh install of Win 10 a month ago when I upgraded my SSD, so it's a clean Windows, not an upgrade. Maybe you should try a clean install if you have the option. If you upgraded from Windows 7 and don't have a Windows 10 license :( I don't know what you can do.

The other thing to consider is the power supply. My GTX 770 2GB uses, I believe, 22 amps on the +12V rail (don't quote me on that), so for SLI you would need AT LEAST 44+ amps on +12V, plus anything else that is on the same rail. So wattage isn't the only thing to consider on a power supply; you have to be careful with the amps as well. I have a 730W 80+ Bronze with 55 amps on the +12V rail, and it is just barely enough. But it is stable.
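That rail budget can be sketched like this. The 22 A per-card figure is the recollection stated above (not a datasheet value), and the 10 A allowance for the rest of the system is a rough assumption:

```python
card_amps = 22        # per GTX 770 on the +12V rail (recollection, not datasheet)
rest_of_system = 10   # rough allowance for CPU, fans, drives on the same rail
rail_rating = 55      # the 730W PSU's +12V rating mentioned above

needed = 2 * card_amps + rest_of_system
print(f"~{needed} A needed of {rail_rating} A -> {needed / rail_rating:.0%} loaded")
```

Under those assumptions the rail sits near its rating, which matches the "just barely enough" observation.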

I hope you get it up and running. I feel your pain man, trust me, as I was trying to fix my GF's laptop last weekend, and it just ruined it for me. I love doing this stuff, but sometimes... I tried Win 10, Win 8.1, Win 7 x64, Win 7 x86, and just no: they all BSOD after a clean install, every time with a different message. The RAM is working (memtested), the hard drive shows 100% health in Hard Disk Sentinel, so none of the removable hardware is the problem. So I took the laptop apart, cleaned out the dust, removed and reseated the CMOS battery, and replaced the thermal paste with Noctua paste (no shitty thermal paste for my GF :)), and still nothing :(. And the crappy laptop can't even do a BIOS update, god it is annoying. This is like Linus's spooky hardware going on here: BSODs for no apparent reason.
In the end, after 3 days of troubleshooting the crap out of everything, I just gave up and sent it to a computer repair shop, but to be honest I don't think they will fix it; if they do, congrats to them.


54 minutes ago, majsta said:

-snip-

As for the SLI talk, just watch this video:

 

It's the best tech channel in terms of reliable benchmarking and comparisons.

They test games with 980 Tis in SLI vs Fury Xs in CrossFire, and the scaling is around 50% on average. SLI is much worse than CrossFire in that regard, so you won't get double the performance in anything but 3DMark. Overall, CrossFire seems better, as the scaling on SLI is just sad. When you look at the FPS from one card relative to what you get after adding the second one, it's just not worth considering...
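In concrete terms, ~50% scaling means the second card only buys you half a card's worth of frames. Illustrative numbers, assuming a 60 FPS single-card baseline:

```python
single_card_fps = 60              # assumed single-card baseline
for scaling in (0.5, 0.9, 1.0):   # typical SLI, a good case, perfect scaling
    fps = single_card_fps * (1 + scaling)
    print(f"{scaling:.0%} scaling -> {fps:.0f} FPS")
```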

 

As for VRAM, I'd recommend you watch this:

And that's an average ~5% performance loss with the 3GB version, not a 2GB card (which has a third less VRAM again). In particular, look at the Hitman benchmark in the video to see what really happens when VRAM gets full and data needs to be switched out and pulled from regular system memory, which is much slower.



On 16/09/2016 at 9:42 AM, majsta said:

OK guys so this is my dilemma, 

   I have an MSI GTX 770 2GB GAMING, and the card is great, but I have been thinking of upgrading lately and I settled on a GTX 1070. I am skipping a generation, as I want the card to last me a couple of years. But this is a problem in my country, as the pricing on the GTX 1070 is so high: the cheapest one is a Gigabyte Mini GTX 1070 at ~500+ euros, and the ASUS Strix, the one I want, goes up to even 550, which is just insane. I have the money, don't get me wrong, but it is like half of my paycheck just for the GPU, and I just can't force myself into giving that much money; I am thinking about it, but just... argh. Buying on eBay is not an option for me, as only a few sellers ship here, or the shipping costs are like 50 euros, and customs is 20%, so the price ends up more than I would pay here.
   The other option is to SLI my GTX 770. I have a deal for 100 euros, used, from a friend of mine: the same model, MSI GTX 770 2GB GAMING. Both cards have actually been running in similar setups for the same time, and it has not worn out. And I am sure I can sell that GPU in a year for the same money; 100 euros is still cheap for this GPU where I live.
   So, finally, the question: would you guys go for the SLI, or save up for the new GPU in the hope the price goes down (which is highly unlikely)?

The price of the 1070 may come down to 400 euros, but if you want a nice upgrade now, I think SLI is a good option. I have played a few games on an ASUS MARS 760 (two 760s on one card), which is slightly slower. It's perfectly good in most games at 1080p; the only thing it struggled with was Metro: Last Light on max settings. When I had my Fury X I did a bit of comparison between the two cards, and at 1080p the 2GB card was fine; it wasn't until 1440p and 4K that the Fury X really started to pull ahead, because of the extra VRAM.

 

 

A decently optimised game will use all the VRAM on any card, whether it has 8GB or 2GB or 1GB; the 8GB card can just store more, that's all. Just because a game uses 6/8GB on an 8GB card doesn't mean it won't work on a 2GB card.

Rig Specs:

AMD Threadripper 5990WX@4.8Ghz

Asus Zenith III Extreme

Asrock OC Formula 7970XTX Quadfire

G.Skill Ripheartout X OC 7000Mhz C28 DDR5 4X16GB  

Super Flower Power Leadex 2000W Psu's X2

Harrynowl's 775/771 OC and mod guide: http://linustechtips.com/main/topic/232325-lga775-core2duo-core2quad-overclocking-guide/ http://linustechtips.com/main/topic/365998-mod-lga771-to-lga775-cpu-modification-tutorial/

ProKoN haswell/DC OC guide: http://linustechtips.com/main/topic/41234-intel-haswell-4670k-4770k-overclocking-guide/

 

"desperate for just a bit more money to watercool, the titan x would be thankful" Carter -2016


6 minutes ago, Morgan MLGman said:

-snip-

OK, I watched the video on 3GB vs 6GB GTX 1060.


I agree with everything said. But you didn't read what I am writing to you. In 1080p games, I just don't care about a 1-2 FPS drop as long as it stays above 60FPS, and it does. And the majority of games run great at 1080p. So if the card drops frames at 1440p, as shown in Assassin's Creed in this video, I don't care, as I am on 1080p and the game will run great at 1080p: maybe a 1-2 FPS drop, but overall over 60FPS, so I can live with that (you have the game performance in the video I posted). And the Hitman test is on DX12, and my card is a DX11 card, so this test isn't valid for me, sorry; I would have to see the DX11 performance comparison. And even if it were, if I can get over 60 FPS, and I believe I can with my setup, it doesn't matter.
The new Tomb Raider and DOOM may not run on ultra with max AA, but I will drop the AA settings and still have the same experience at 1080p/60FPS. My point being, you can't see the difference unless you are looking for it; your eyes just don't see the difference from high to ultra. Try it out yourself: you have the superior hardware and can run both ultra and high to compare.
Stutter may be an issue, but as far as I can see in both of the videos (talking about the VRAM comparison ones), there's no significant performance drop and no stutter at 1080p.

I'll go and watch the SLI one now.


22 minutes ago, Jumper118 said:

-snip-

Exactly my point in the comments.

I am on 1080p; my monitor is a 27" ASUS with a 60Hz refresh rate, so 1080p/60FPS is optimal for me. And one card is pulling its weight great in most titles I've played. I had some problems with The Witcher 3, for example, with one card: 60FPS no problem with almost everything on max, but HairWorks kills the performance. I don't know if it's the VRAM or the GPU being underpowered, but it made me think of an upgrade for the first time, hence this thread. Other games, no problems to be honest, but hey, for 100 euros I think this is the best deal at the moment.


6 minutes ago, majsta said:

Exactly my point in the comments.

-snip-

I would agree. And you lose a lot less money, and you can still save for a better card while using the SLI cards.



49 minutes ago, majsta said:

-snip-

Hmm, interesting. I think first, though, I'm going to have him clean the PC, since I kind of want to make sure dust buildup is the reason (last time I did a drive install for him... wow... so much dust inside...). Then I'll remove all the overclocks just to see.

 

Yeah, I had to clean install... I originally had Windows 7, did the upgrade, and it worked fine for about a month... but then it got into a BSOD loop. I was forced to clean install. My friend's PC I clean installed as well.

 

His PSU and my PSU should be plenty. He has the Cooler Master V850, and I have the Corsair AX860i. I went a bit overboard with the PSU just for some headroom.

 

Yeah, I agree, sometimes computer tech can be very annoying. I might clean install again in the future just to see. I usually do content creation so I don't see BSODs as much as my friend (who games much more). Curious to see what is wrong with that laptop though.


5 minutes ago, Jumper118 said:

I would agree. And you lose a lot less money, and you can still save for a better card while using the SLI cards.

Well, the used market here is expensive, so 100 euros for a GTX 770 is a deal from a friend; one usually goes for 140 to 160, and they get sold in like 2-3 days. It's just like that around here. You can get one on the open market in my country, but it is nowhere near the prices in the USA, Germany, or some other EU countries. I saw these cards go for 80 euros on eBay in the EU, so go figure.


14 minutes ago, majsta said:

-snip-

I got a 680 Lightning for £72 from eBay UK :) that was a good deal :D



40 minutes ago, majsta said:

The Hitman test is on DX12, and my card is a DX11 card, so this test isn't valid for me, sorry; I would have to see the DX11 performance comparison.

DX12 doesn't change anything in terms of VRAM; it could just as well be a DX11 game :) And the 770 does support DX12, since Kepler supports DX12.



10 minutes ago, scottyseng said:

-snip-

The same thing happened to me when I went from Win 8.1 to Win 10: it worked for a month and then started to BSOD on random stuff. First I thought it was faulty hardware, but all the tests came back OK. Then I dropped my CPU clock 100MHz and it was fine, and then a new SSD came along, so I reinstalled and got my +100MHz back (it really isn't the SSD; the clean Windows install and fresh drivers probably got it working fine). No problems since, knock on wood.

 

Your PSUs should be fine then. Those are both great units; no problem there, I am sure of it.

I don't know the condition his PC is in, what case he is using, or where he keeps it, but I can give you some useful advice.
Positive pressure inside the case is always good, but make sure you have some filters on the intake. If you don't have any filters on the case, a good disposable filter is old nylon stockings (the nice ones women wear), which you can duct-tape on to ghetto-mod the case. Especially if the intake is on the bottom of the case, you won't see them and they will keep the dust away. Just make sure you change them once in a while ;).

Also, if the intake is on the bottom, don't keep the PC on the carpet; it is a hazard waiting to happen, especially if the PSU is at the bottom and intakes from below. A nice piece of wooden floor is all you need.

And tell him to run a vacuum cleaner around the PC once in a while; it is nothing to be embarrassed about if he wants to keep his stuff nice, clean, dust-free, and working longer.
 

And as for the laptop, I'll post here when I get some info from the repair shop :). I hope they fix it though, otherwise she'll want to use my PC :( and we can't have that now, can we :D


9 minutes ago, Morgan MLGman said:

DX12 doesn't change anything in terms of VRAM; it could just as well be a DX11 game :) And the 770 does support DX12, since Kepler supports DX12.

I didn't say it doesn't support it :), I said it is a DX11 card.
Support is one thing; a lot of cards support DX12, it's just an API. But this card was built around DX11 and OpenGL 4.x, and all of its drivers are DX11- and OpenGL-based. The Kepler architecture can run DX12 games, as well as Vulkan games, but it cannot utilize all of their features to the fullest. It does not have async compute (well, none of the NVIDIA cards have anything like AMD's implementation), so all those parallelism benefits and the low-level hardware control really don't do much for the 770. I would argue that the GTX 770 would do worse on DX12 than on DX11 in Hitman. I'd like to see that test, to be honest, but I don't have the game.


1 hour ago, Morgan MLGman said:

-snip-

OK. I watched the SLI and CrossFire video comparison. 
First of all, support for SLI and/or CrossFire has to come from the game developer for the game to use it. So if one brand or the other shows more or less performance, I would argue that the game's support for that brand is better. And some games don't support either. I think we can all agree on that.

And now for the important thing: the GTX 980Ti is a different architecture than the GTX 770, so if the first scales at 50% it doesn't mean the second one will also scale at 50%. There is an undeniable difference here, an architectural difference if I may say so :). So to see how the 770 would scale, we would have to run it in single, dual, and triple SLI. And as it just so happens, there is an article about this: http://www.guru3d.com/articles_pages/geforce_gtx_770_sli_review,1.html (to be fair, some older games are featured, but they were AAA titles two years ago, so give me some slack) and it comes to the same conclusion I am arguing here.
At 1080p 60FPS it's just fine, and SLI scaling depends on the game.
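For reference, the "scaling" figure being thrown around in this thread is just the extra FPS a second card adds relative to one card. A quick sketch of that arithmetic (the FPS numbers below are made up for illustration, not from any benchmark):

```python
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Return multi-GPU scaling as the percentage FPS gain over a single card."""
    return (sli_fps / single_fps - 1.0) * 100.0

# Hypothetical numbers: 45 FPS on one GTX 770, 68 FPS with two in SLI
print(f"{sli_scaling(45, 68):.0f}% scaling")  # prints: 51% scaling
```

By this definition, "perfect" scaling is 100% (double the FPS), and the 50% figure both sides keep citing means the second card adds half of one card's performance.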


1 minute ago, majsta said:

-snip-

No, scaling is not related to the devs; CrossFire just scales better. If anything, devs spend more time optimizing for SLI, as Nvidia sponsors many more games.

 

If anything, the 770 would scale even worse than the 980Ti, as Kepler is much older and less widely supported nowadays. So 50% average game scaling is an optimistic prediction for your 770s.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 minute ago, Morgan MLGman said:

No, scaling is not related to the devs; CrossFire just scales better. If anything, devs spend more time optimizing for SLI, as Nvidia sponsors many more games.

 

If anything, the 770 would scale even worse than the 980Ti, as Kepler is much older and less widely supported nowadays. So 50% average game scaling is an optimistic prediction for your 770s.

How can you just say NO? Did you even look at the article? Different games, different scaling, and the hardware is the same; only the developer is different. Ergo the logical explanation is that it depends on the game and the game developer. And how do you explain the benchmarks? Heaven and Valley: I tested them myself, near-perfect scaling, and in Valley I got more than double. These benchmarks are NOT doing anything to "optimize for Nvidia" as you argue. So there is nothing wrong with NVIDIA's SLI driver support for older cards. Dude, I get driver updates every 7 days for new titles. Or do you think Nvidia just stopped making drivers for older cards after the release of the 900 series? Because they didn't.

 

For your other argument: so you want to tell me that the devs don't have to do anything to get CrossFire to work? You think they just say "use CrossFire" and it works out of the box? No, man. Both engine developers and game developers have to work really hard to extract that performance, and they DO, because they want their engine/game to perform its best on the worst hardware possible, because then they will sell more and earn more.
Or you just simply like AMD more than Nvidia and want to say they make better cards. Because they don't; for the last 5 years they really don't. None of their cards is better tier for tier, they simply aren't.


Would it please you if I ran XCOM 2, DOOM, and Witcher 3 on a single card and in SLI to see how it scales, if you don't believe the guys from Guru3D and what they wrote in the article? Or can we just take a look at GTA 5 in this video and see the side-by-side comparison of the MSI GTX 770 2GB cards I have?

And after this, agree that we get 50% scaling.

 

PS: And I hate to bring this up, dude, but I have a Master's degree in Electrical and Computer Science. I have more than 5 years of experience in software development, so computers, hardware and software, APIs, and even game dev in Unity are kinda my thing, and I would argue I know a thing or two about "devs spend more time optimizing" things. Trust me on this.


25 minutes ago, majsta said:

-snip-

First, I put it wrong: CrossFire scales better in general, but that's not thanks to devs. It's actually a better technology than SLI, but it's less supported by devs because, as I mentioned, Nvidia sponsors WAY more titles, and that means more optimization work for SLI support.

You're wrong in basic things here:

- Yes, you get "driver updates" for new games with optimizations. Those optimizations are not for your card, though. Kepler was abandoned in terms of driver optimization a while ago now. This is a widely known fact, and you're not actually getting any performance benefits (check that yourself).

- Besides game developers and engine developers, you've forgotten AMD's engineers, who also need to create a CrossFire profile for the game in the drivers.

- Synthetic benchmarks have nearly perfect scaling. That's well known, no news here; they test theoretical performance, not real-world performance.

- DigitalFoundry tested 7 games; if you calculated the average SLI scaling from those 7 games, it'd be around 50%. That's a better and more representative test group than the GTA V-only video you linked.


Quote

Or you just simply like AMD more than Nvidia and you want to say they make better cards, because they don't, for the last 5 years they really don't, none of their cards is better tier for tier, they simple aren't. 

- That's the biggest BS I've seen on this forum so far, and you know it; maybe emotions took over, and deep down you know it isn't true. A very simple example is right here under our noses: GTX 780Ti vs R9 290X. The 780Ti was considered the faster card (and the more expensive one). Look what happens in the latest reviews:

[image: relative GPU performance chart at 2560×1440]

 

The 290X is now 6% faster at 1440p, and at 4K the difference grows to 14%(!!!). So AMD did not make a GPU that's better tier for tier than Nvidia in the past 5 years? Pfff.

R9 390 vs GTX 970? A 7% difference. These were also more recent, equally-tiered cards. Need more proof of this BS statement? I can bring up the R9 380 and GTX 960 as well: AMD has a 6% lead again, and again, same tier.
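The averaging point made above (taking per-game scaling figures and averaging them across a test group) can be sketched like this. The game names and FPS values below are invented for illustration; they are not DigitalFoundry's actual data:

```python
# Hypothetical per-game results: game name -> (single-GPU FPS, SLI FPS)
results = {
    "Game A": (50, 80),
    "Game B": (60, 85),
    "Game C": (45, 70),
}

# Per-game scaling: the percentage FPS gain from the second card
scalings = [(sli / single - 1) * 100 for single, sli in results.values()]
avg = sum(scalings) / len(scalings)
print(f"average SLI scaling: {avg:.1f}%")  # prints: average SLI scaling: 52.4%
```

Averaging over a larger group of games gives a more honest picture than any single title, since per-game scaling varies so widely.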

 

P.S. I work in IT. I'm a Systems Engineer at an IT company that specializes in security/storage/backup and archiving of data, and I still can't see how it's related to anything we were talking about here. Also, if you knew so much, you'd already know that going SLI with two 2GB, two-generation-old, upper-mid-range cards that don't even get proper software optimizations anymore is just a plain bad idea. It introduces much more heat (Kepler wasn't efficient), high power consumption (again, Kepler), and the typical issues of dual-GPU setups: stuttering, bad optimization, no SLI support in some games whatsoever (Batman: Arkham Knight is a perfect example of a recent AAA title that did not have ANY kind of multi-GPU support, despite being heavily sponsored by Nvidia), and VRAM limitations that will be much more punishing, because two cards can run games at higher settings that logically require more VRAM.

 

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


So why is all this personal-preference stuff being argued? People who have never dealt with VRAM limitations or SLI issues telling others what to buy? Based on what? What they've read? What videos they've watched?

Main RIg Corsair Air 540, I7 9900k, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090FE, EVGA 1000G5, Acer Nitro XZ3 2560 x 1440@240hz 

 

Spare RIg Lian Li O11 AIR MINI, I7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32Gb, EVGA 1080ti, 1080sc 1070sc & 1060 SSC, EVGA 850GA, Acer KG251Q 1920x1080@240hz

 


5 minutes ago, Morgan MLGman said:

-snip-

 

You are missing the point once again. I am not arguing about who has a bigger .... card. 

 

YES, the Nvidia guys are dicks for pushing SLI, I agree. AMD did away with the CrossFire bridge/cable a long time ago, and the technology is similar: multi-GPU support on both sides. Nvidia could adopt the idea, but they won't. I am sure you know how important it is in IT to push your own ideas and be the leader, and both you and I know they will never abandon SLI. We can go on about whether one or the other is better, but that is not the point here. My point is, and always has been from the beginning, that I can get a 50% performance benefit with SLI, and I know you will agree with me on this.

Now, let's get back to the basics of the problem. I am running a 1080p 60Hz monitor. I am NOT GOING TO UPGRADE TO 1440p any time soon; as I said, maybe never, as I find this just enough for my daily use. And as I said, at the moment I don't care about the power draw and/or temps, because the cards run just fine in my setup. So please don't pull up charts of 1440p gaming on these cards. I know you are making a point about better hardware there, and it is a valid comparison, but just remember that the R7 200 and R9 200 series are refreshes, the same as the GTX 700 series is of the GTX 600 models. And yes, AMD cards tend to get better as time passes, but that is just due to poor driver support at release (I hear they have improved a lot in the last year). So these cards come into their own after a few driver iterations, but at release they suck :(. And I had an AMD card in my previous system; don't get me started on the drivers for Windows, not to mention the drivers for Ubuntu. Getting a bit off topic here.

 

To run games at 1080p 60FPS, two-way SLI on the MSI GTX 770 2GB OC is more than enough. That is my opinion, and I don't see any valid proof otherwise, except for a few games we can argue about (Hitman, where all Nvidia cards run like crap compared to AMD, and Batman: <insert any title here :)>, which are just plain shit when it comes to implementation; we can agree on that for both AMD and Nvidia cards). As I said, for DOOM and Tomb Raider, if I can't play them at MAX + AA, I will drop the AA and/or bring down the graphics slider. I think I can, but without proof I can't say for sure.

 

Working in IT sure relates to this topic. Most people tend to blabber (I am NOT SAYING you are one of them) about things they hear or read online without really getting to the core of the problem or technology. They don't do research on the topic, they don't have a good understanding of how a CPU/GPU or anything else works, they don't follow up on new technologies, and most important of all, they have never even tried to code something, not even "Hello World". Not to mention they don't know what polygons, textures, shaders, or texels are, and how a GPU uses them. I was telling you this to address the statement you made that devs don't have to work on support for SLI/CrossFire. They do. We do. From my own experience, in our software we have support for DX9, DX10, and DX11 (I know it is not the same thing as SLI, but bear with me here), and you have to have support on both sides if you want your product to be the best, and to work its best on any crap hardware (GTX 770). And you do; not just driver support on the manufacturer's side (I didn't forget the Nvidia and AMD devs, I just figured that was a given).

 

So, as one IT guy to another, can we agree that I can get 50% performance from SLI, and that I can "tweak" the games to run at 1080p 60FPS with VSync on?

This has gone way off the initial topic. I should have asked: is GTX 770 SLI enough for 1080p 60FPS on High/MAX settings?

