nVidia has a GTX1060 with only 3GB of VRAM, but that's not all

zMeul

I don't understand why people are fighting over the "minimum VRAM amount" for 1080p. The card has a fucking 6GB option; if you think 3GB is too little, get the 6GB card and move on with your life. Simple as that.

Specs: CPU - Intel i7 8700K @ 5GHz | GPU - Gigabyte GTX 970 G1 Gaming | Motherboard - ASUS Strix Z370-G WIFI AC | RAM - XPG Gammix DDR4-3000MHz 32GB (2x16GB) | Main Drive - Samsung 850 Evo 500GB M.2 | Other Drives - 7TB/3 Drives | CPU Cooler - Corsair H100i Pro | Case - Fractal Design Define C Mini TG | Power Supply - EVGA G3 850W


Hmm... this just sounds like another recipe for disaster, like the RX 470 vs RX 480: way too close, but this time they're nerfing the lower-VRAM card :/

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :)


Just now, TheKDub said:

I don't understand why people are fighting over the "minimum VRAM amount" for 1080p. The card has a fucking 6GB option; if you think 3GB is too little, get the 6GB card and move on with your life. Simple as that.

Most people in the thread agree with you. The problem is the naming. If they really do call it the "GTX 1060 3GB", the name won't reflect the fact that the 3GB version has 10% fewer cores. People will expect little to no difference in performance when playing at lower textures/resolution, but it won't be the same performance with a 10% core deficit.

 

As for the people arguing about VRAM mattering, it's all situational at best. It depends on the user's preference, the game/workload, other components and their potential bottlenecks, etc. People forget that these budget cards are often paired with budget CPUs and terrible monitors. However, if I start making too much sense, they might stop fighting, and that won't be fun for anyone.
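That "10% fewer cores" figure is easy to sanity-check. A two-line sketch, assuming the rumored shader counts of 1280 for the 6GB card and 1152 for the 3GB card (not confirmed specs):

```python
# Sanity check on the "10% fewer cores" claim. The shader counts here
# (1280 for the 6GB card, 1152 for the 3GB card) are the rumored
# figures, not confirmed specs.
cores_6gb, cores_3gb = 1280, 1152
deficit = 1 - cores_3gb / cores_6gb
print(f"core deficit: {deficit:.0%}")  # core deficit: 10%
```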

My (incomplete) memory overclocking guide: 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.


1 minute ago, MageTank said:

Most people in the thread agree with you. The problem is the naming. If they really do call it the "GTX 1060 3GB", the name won't reflect the fact that the 3GB version has 10% fewer cores. People will expect little to no difference in performance when playing at lower textures/resolution, but it won't be the same performance with a 10% core deficit.

 

As for the people arguing about VRAM mattering, it's all situational at best. It depends on the user's preference, the game/workload, other components and their potential bottlenecks, etc. People forget that these budget cards are often paired with budget CPUs and terrible monitors. However, if I start making too much sense, they might stop fighting, and that won't be fun for anyone.

I know a guy who runs 1440p on an R9 280 and a 4790K. :D

Archangel (Desktop) CPU: i5 4590 GPU:Asus R9 280  3GB RAM:HyperX Beast 2x4GBPSU:SeaSonic S12G 750W Mobo:GA-H97m-HD3 Case:CM Silencio 650 Storage:1 TB WD Red
Celestial (Laptop 1) CPU:i7 4720HQ GPU:GTX 860M 4GB RAM:2x4GB SK Hynix DDR3Storage: 250GB 850 EVO Model:Lenovo Y50-70
Seraph (Laptop 2) CPU:i7 6700HQ GPU:GTX 970M 3GB RAM:2x8GB DDR4Storage: 256GB Samsung 951 + 1TB Toshiba HDD Model:Asus GL502VT

Windows 10 is now MSX! - http://linustechtips.com/main/topic/440190-can-we-start-calling-windows-10/page-6


1 minute ago, don_svetlio said:

I know a guy who runs 1440p on an R9 280 and a 4790K. :D

You're talking about me, aren't you? (I have a 4770K, not a 4790K.)

Before you buy amp and dac.  My thoughts on the M50x  Ultimate Ears Reference monitor review I might have a thing for audio...

My main Headphones and IEMs:  K612 pro, HD 25 and Ultimate Ears Reference Monitor, HD 580 with HD 600 grills

DAC and AMP: RME ADI 2 DAC

Speakers: Genelec 8040, System Audio SA205

Receiver: Denon AVR-1612

Desktop: R7 1700, GTX 1080  RX 580 8GB and other stuff

Laptop: ThinkPad P50: i7 6820HQ, M2000M. ThinkPad T420s: i7 2640M, NVS 4200M

Feel free to pm me if you have a question for me or quote me. If you want to hear what I have to say about something just tag me.


Just now, Dackzy said:

You're talking about me, aren't you? (I have a 4770K, not a 4790K.)

Same shit and yes :D


1 minute ago, don_svetlio said:

I know a guy who runs 1440p on an R9 280 and a 4790K. :D

Hey, depending on the task at hand, that is perfectly acceptable. I know people who do that for CS:GO and MMOs. There's plenty of GPU horsepower to run those less-intense games at higher resolutions.

 

I personally run a GTX 1070 at 1440p 100Hz, and I love it. I get made fun of because I only play Elder Scrolls Online and Overwatch, lol.


Just now, don_svetlio said:

Same shit and yes :D

It can do CSGO, GW2 and R6 at 1440p :D 


1 minute ago, MageTank said:

Hey, depending on the task at hand, that is perfectly acceptable. I know people who do that for CS:GO and MMOs. There's plenty of GPU horsepower to run those less-intense games at higher resolutions.

 

I personally run a GTX 1070 at 1440p 100Hz, and I love it. I get made fun of because I only play Elder Scrolls Online and Overwatch, lol.

I tried Witcher 3 at VSR 1440p. I ran out of system RAM :D


1 minute ago, Dackzy said:

It can do CSGO, GW2 and R6 at 1440p :D 

GW2... I hate that game. It brought both my FX 8320 and Pentium G3258 to their knees. One of the few MMOs where you need a highly clocked i5 or better to get the job done, lol.

 

Just now, don_svetlio said:

I tried Witcher 3 at VSR 1440p. I ran out of system RAM :D

Well, I will be putting my system to work this coming week: redoing my old RAM trials and testing CPU overhead again. I've got a ton of AAA titles to work with now, which means this GPU's fan will finally turn on for once.


Just now, MageTank said:

GW2... I hate that game. It brought both my FX 8320 and Pentium G3258 to their knees. One of the few MMOs where you need a highly clocked i5 or better to get the job done, lol.

LOL, it brings my 4770K at 4.5 GHz to its knees, because it is so single-core focused.


Just now, Dackzy said:

LOL, it brings my 4770K at 4.5 GHz to its knees, because it is so single-core focused.

I have not played it in a while. I should try it out again with my 6700K and see if it still stutters in HoTM/world bosses. Without going too far off topic, is that Thorns expansion worth getting?


Just now, MageTank said:

I have not played it in a while. I should try it out again with my 6700K and see if it still stutters in HoTM/world bosses. Without going too far off topic, is that Thorns expansion worth getting?

Get Witcher 3 + Expansions. :D


Just now, don_svetlio said:

Get Witcher 3 + Expansions. :D

I have the base game, not the expansions. I'll get them if you think they make a difference in benching. I don't really plan on playing the games I purchased; I just need them to bench with. I need reliable data that reflects what people will actually experience when gaming.


Just now, MageTank said:

I have the base game, not the expansions. I'll get them if you think they make a difference in benching. I don't really plan on playing the games I purchased; I just need them to bench with. I need reliable data that reflects what people will actually experience when gaming.

The expansions have some improvements, IIRC. Not sure how much. Novigrad is still the same. Though I did get actual Gwent cards :D


Just now, MageTank said:

I have not played it in a while. I should try it out again with my 6700K and see if it still stutters in HoTM/world bosses. Without going too far off topic, is that Thorns expansion worth getting?

I don't get stuttering; I just see 100% load on one core sometimes, and my H100i then starts running at full speed to keep it cool. It is also the only thing that can make my PC crash with any OC; even a 24-hour stress test cannot do that. I like the Thorns expansion.


25 minutes ago, marldorthegreat said:

All 1000 series gpus are about 100 dollars more expensive.

It makes sense. 

So the 1150, or whatever the stupid name for Volta is going to be, will be $300? Because at this point, AMD is trying to keep cards in the same price range, yet people are still going to buy the fuck out of NVIDIA cards, because they're fucking NVIDIA. They'd be fucking stupid to make it a 1050. They've clearly said $250 for a 3GB 1060, so why the hell are people making stupid-ass assumptions that it is now a 1050?

 

 

i7-6700k  Cooling: Deepcool Captain 240EX White GPU: GTX 1080Ti EVGA FTW3 Mobo: AsRock Z170 Extreme4 Case: Phanteks P400s TG Special Black/White PSU: EVGA 850w GQ Ram: 64GB (3200MHz 16x4 Corsair Vengeance RGB) Storage: 1x 1TB Seagate Barracuda, 240GB Sandisk SSD Plus, 480GB OCZ Trion 150, 1TB Crucial NVMe
(Rest of Specs on Profile)


3 or 4 GB wouldn't be bad if the games we play weren't so badly optimized, no? So I'm thinking it's not NVIDIA or AMD's fault that games pretty much suck; 3GB should be enough. Now don't take my opinion too seriously, I'm just a guy who has zero idea about GPUs and only a vague notion of how games work.


4 minutes ago, somebody* said:

3 or 4 GB wouldn't be bad if the games we play weren't so badly optimized, no? So I'm thinking it's not NVIDIA or AMD's fault that games pretty much suck; 3GB should be enough. Now don't take my opinion too seriously, I'm just a guy who has zero idea about GPUs and only a vague notion of how games work.

What? Ubisoft is paving the way! Their games are so hardcore, you need hardware that doesn't yet exist to properly enjoy them. Besides, Nvidia Gameworks is a staple. How else will I be able to enjoy excessive beard physics?

 

No, but seriously. People need to take a page out of Naughty Dog's book. Uncharted 4 looks way too good to be a PS4 game, and it puts a lot of these PC games to shame. Imagine if we had a game tailored to PC hardware the way they did with the PS4. I know it's easier to do with the PS4, since it's a closed ecosystem when it comes to hardware (no Intel vs AMD, no AMD vs Nvidia, etc.) and there are no varying degrees of detail to account for, but it would be absolutely amazing to see what could be done if a publisher gave their devs proper resources and time to pull off a masterpiece. I just hope I live long enough to see that happen.


27 minutes ago, DarkBlade2117 said:

So the 1150, or whatever the stupid name for Volta is going to be, will be $300? Because at this point, AMD is trying to keep cards in the same price range, yet people are still going to buy the fuck out of NVIDIA cards, because they're fucking NVIDIA. They'd be fucking stupid to make it a 1050. They've clearly said $250 for a 3GB 1060, so why the hell are people making stupid-ass assumptions that it is now a 1050?

They can overcharge because AMD has nothing at all to compete with. The 1060 is better than the RX 480, and the 1070, 1080 and Titan XP have no competition. It's the start of an Nvidia monopoly, like the Intel monopoly on high-end CPUs.

Hello This is my "signature". DO YOU LIKE BORIS????? http://strawpoll.me/4669614


2 hours ago, Majestic said:

No.

 

So now 4GB isn't enough. At what point does this still revolve around the memory required for a typical gaming scenario, and where does it start drifting into terrible game optimization and actively hunting for scenarios that choke on memory requirements?

 

3 out of 5 examples are Ubisoft Montreal games. Talk about a common denominator, and no doubt the reuse of engines and assets. As for the other example, Rise of the Tomb Raider, I've not seen stuttering or hitching on my own 4GB card, though it does take a performance hit on Very High textures. As for Metro, that game has big performance variance when testing; I've never seen any loop give me <5% variance versus the previous or next one.

 

But in all seriousness, as if the x60 series has ever been considered a staple for gaming at Ultra settings at any point. You buy midrange cards knowing you get a next-to-perfect experience barring a few concessions. If that means dropping the textures (meant for higher resolutions and GPUs) with little to no visual hit, that's perfectly fine for most folks.

 

Games now have to accommodate a wider variety of resolutions, which also means shipping higher-resolution texture packs. You can't use the same texture packs for 4K as you do for 1080p. Setting everything to maximum may not even make sense at 1080p, as the PPI is too low to notice the uptick in texture resolution. It's reached the point where developers retroactively remove their higher texture packs because people with midrange cards start complaining about poor performance, or attach warnings when selecting them.

 

This whole discussion is stupid. Anyone can find scenarios where a card chokes for memory; you can even find them for the Titan X. That does not imply the settings were (a) designed for that card, or (b) make any sort of sense. And unless games start becoming unplayable on lower-memory cards at reasonable settings, this obsession with memory needs to stop. Have you seen the pricing of graphics cards lately? Think that has nothing to do with us going from 2GB to 8GB on midrange units in the span of a year?

 

You were being deliberately inflammatory with that post. Don't even pretend to claim the moral high ground here, Mr. Memes.

R6 Siege does NOT use the same engine as Far Cry or AC. They have different engines for different games. Fact. Ubisoft should get props where props are due: they are very good at making engines for a specific use case, unlike, say, EA, which uses Frostbite 2/3 for everything, or the Unreal Engine, which is applied to lots of things without being exceptional at anything other than FPS-based games.

 

Rainbow Six Siege and some of the later AC games use the Anvil Next engine.

Far Cry 4/Primal use the Dunia Engine 2.

Unrelated to the discussion: The Division (also a Ubi game) uses a brand-new engine called Snowdrop.

 

It is not about whether it is "stable" on "Ultra". It is about the fact that Ultra is already this demanding. Even at High, you don't get great framerates, you get so-so framerates. A 290X, which is what I've got, is comparable to a 480 or 1060; fact, we know this from all the reviews around the web. It is roughly the same GPU power, maybe a bit faster than the new cards.

There is not a single operation, short of professional-grade use cases, where you can "choke" a Titan X on memory. Fact.

Which makes it a strawman argument.

 

Unfortunately, whilst the monitor is "1080p", you also have to consider VSR/DSR, which work separately and differently from in-game supersampling, although the end result is almost the same (with better scaling from GPU-based SS). This means the 1080p card can be subjected to even HIGHER loads.

 

Blaming Metro for having variance? Well, Tom's Hardware, PCPer, Gamers Nexus and many others consider it a reliable benchmark. Between you and these aforementioned reviewers, the credibility of the reviewers is n^4 higher than yours is atm.

 

The discussion is perfectly fine; it just seems the direction it is heading does not correlate well with your perception of reality. That is not the fault of the discussion, but of your subjective opinions.

Following your logic, a demanding game is simply unoptimized, while a game with lower-quality assets but stable performance is heavily optimized. Further, if modern games only require "yesterday's" hardware to perform, then we don't have to improve hardware at all. It would be better if AMD and Nvidia didn't waste money on new products, then. Wouldn't it?

 

No, we MUST move on. Games MUST improve. Hardware MUST improve.

 

However, I must ask: if 3GB is enough today, or if 4GB is enough today, why the fuck do AMD and Nvidia make cards with 6-12GB? With the current density of DRAM stacks, they could BOTH just use fewer memory dies to save manufacturing cost. It wouldn't be the first time either GPU maker skimped on something in order to save money.

Fact is, VR is not the reason they are adding more VRAM either. VR needs processing power, not high-detail assets. In fact, most VR games use LOWER-quality assets in order to maintain the required framerate.

 

The coming 4K "craze" has little to do with this either. The adoption rate of 4K displays and monitors is still at an all-time low. Even among TV sales, 1080p is still the dominant resolution, despite 4K TVs becoming increasingly affordable.

AMD and Nvidia could both have waited one more generation before having to address "the 4K craze" with higher VRAM amounts, if we were to base our "VRAM needs" on games from 2-3 years ago.

 

The pricing of DRAM also doesn't matter much here. A single 1-2GB chip costs around $15. AMD and Nvidia buy in bulk, so they are probably paying $10-12 per DRAM die, maybe even less; what do I know. What I do know is that the cost of DRAM, given its high yields, low manufacturing cost and widespread availability, is NOT the factor behind increased GPU prices. That factor is elsewhere; I'm 99% sure it is purely a question of margins, not actual manufacturing cost.
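To put that margin argument in rough numbers, here is a back-of-envelope sketch. The $11 midpoint and the 1GB-per-die layout are my assumptions, based only on the bulk price range guessed at above:

```python
# Back-of-envelope BOM difference between a 3GB and a 6GB card, using
# the bulk price guessed above (~$10-12 per die, $11 midpoint assumed)
# and assuming 1GB per die.
def extra_vram_cost(extra_gb, price_per_die=11.0, gb_per_die=1):
    """Estimated extra bill-of-materials cost for additional VRAM."""
    return (extra_gb / gb_per_die) * price_per_die

print(extra_vram_cost(3))  # 33.0 -> roughly $33 extra for 3GB more
```

On those assumptions, doubling the 1060's VRAM adds maybe $30-40 of parts, nowhere near typical retail price gaps.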

 

Marketing is not a factor either, as manufacturing cost trumps wasting money on one-upping your competitor with useless things. Yes, some say the 24GB vs 32GB Quadro/FirePro thing was "just" a marketing ploy, but there is professional-grade software that is more than happy to use those amounts of DRAM, so that discussion is a huge circle-jerk in the first place.

 

You are right that today's games need to accommodate a variety of resolutions, but if I were a game dev, I would look more into scalable solutions rather than adding multiple versions of the same file. This is also why games behave differently between 21:9 and 16:9/16:10: the assets used may or may not like being stretched, so some games solve the issue by simply "quietly" extending the FOV for 21:9 players, causing an unfair advantage but ultimately circumventing the need for additional 21:9-aspect textures, or for textures with additional scaling metrics, which simply adds complexity.

 

If we look at games from two years back, the heyday of the "2GB is fine for 1080p" discussion, we find that those games, under closer scrutiny, used more advanced post-processing and effects, like ambient occlusion and other shading effects, to create a more "contrast and color" based visual experience rather than a "lines and shape" based one, which is what you get with proper use of textures. The heavy use of tessellation in recent years has also amplified the need for good shading rather than raw texture data.

This is merely an observation I have made while playing games (trees and brick walls are great for comparison, as no matter what game you play, the texture quality you expect shouldn't differ that greatly between AAA titles). I've also noticed it in videos where Digital Foundry does close-up shots, mostly in "console vs PC" videos; at higher zoom levels, or up close to the object in question, it becomes very clear where the focus for game developers has been lately.

It has been on post-processing and shader effects, not the basic visuals (textures and meshes). Post-processing and shader effects, and warp effects such as tessellation, barely take any VRAM resources.

 

I am not sure why you always seem to come running into these sorts of topics, acting like a white knight crusading against heretics. Perhaps all you want is to create controversy; if so, congrats, you succeeded.

 

I want to ask you, though: if we want to test for VRAM deficiency, how do we do it? How would you do it?

I know a bomb-proof way that will show it right away, but I will not share it yet. I want to hear your testing methodology, and the reasoning behind it, first.
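For what it's worth, one common methodology (my assumption, not the "bomb-proof" method kept secret above) is to log per-frame render times with a capture tool and compare average FPS against 1%-low FPS: VRAM overcommit tends to leave the average intact while collapsing the lows. A minimal sketch:

```python
# A sketch of one possible VRAM-deficiency test (an assumed methodology,
# not the poster's secret one): log frametimes in ms with a tool like
# FRAPS or PresentMon, then compare average FPS against 1%-low FPS.
# Paging textures over PCIe produces periodic hitches that barely move
# the average but tank the lows.
def fps_stats(frametimes_ms):
    """Return (average FPS, 1%-low FPS) for a list of frametimes in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 / (sum(frametimes_ms) / n)
    worst = sorted(frametimes_ms, reverse=True)      # slowest frames first
    worst_1pct = worst[: max(1, n // 100)]           # worst 1% of frames
    low_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_fps

# Smooth run vs a run with one 100 ms hitch per 100 frames
# (the kind of spike VRAM swapping produces):
smooth = [16.7] * 100
hitchy = [16.7] * 99 + [100.0]
print(fps_stats(smooth))   # average and 1%-low nearly identical
print(fps_stats(hitchy))   # average barely moves, 1%-low collapses
```

Running the same scene twice, once just under and once just over the card's VRAM limit, and comparing the 1%-low figures would make the kind of above/below-limit screenshots later in this post quantitative.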

 

As for the screenshots, let me do a few more for you, OK?

1080p: High preset, Very High preset and Ultra preset. No custom tuning of settings, just presets.

I will also run in windowed mode so there is JUST A SINGLE 290X RUNNING, aka the closest I have to 1060/RX 480 performance.

I will also complement these settings with a run of the canned in-built benchmark to show the performance difference.

To keep things clean, here is the test setup:

  • CPU: Intel i7 4790K, stock clocks (4.0GHz / 4.4GHz turbo)
  • CPU cooler: Cooler Master Nepton 240
  • Motherboard: ASUS Maximus VII Hero
  • Main memory: Kingston Savage, 32GB DDR3 (4x8GB), 1866MHz CL9
  • Storage: Samsung 850 Evo 250GB (system), Samsung 850 Evo 500GB (Steam folder)
  • GPU: XFX Radeon R9 295X2 Core Edition, stock core clocks, memory at 1400MHz, power limit at +50%
  • Power supply: Corsair AX1500i, 1500W, 80+ Titanium, with Corsair AX individually sleeved cables (mains power is 233V IT)
  • Ambient temperature: 19-22°C (the aircon is running, but my room is far enough from it that it heats up faster than the cold air can cool it)

Now for the screens and tests:
1080p High preset:

[Screenshots: three benchmark captures]


1080p Very High Preset:

[Screenshots: two benchmark captures]

 

1080p Ultra preset: (Note: this setting uses 380MB LESS than my 4072MB limit. Watch how much just getting CLOSE to the limit affects the framerates, despite the textures still being the 4K DLC "Ultra" textures in both tests.)

[Screenshots: two benchmark captures]

 

 

OK, question time: does this game's in-game "VRAM meter" actually matter? What is the difference between being above and below the VRAM limit?

I did a test where I was 26MB below and a test where I was 24MB above the 4072MB limit. These are the results (still 1080p, windowed mode). And what happens when you go WAY above the limit? I also tested going 576MB above (1080p, maxed settings).

 

-26MB test:

[Screenshots: two benchmark captures]

+24MB test:

[Screenshots: two benchmark captures]

+576MB test:

[Screenshots: two benchmark captures]

 

 

For those in a tinfoil hat about HBAO+ being a Gameworks effect, here are results WITH and WITHOUT it.
Ultra preset without HBAO+:

[Screenshots: two benchmark captures]

 

Ultra Preset WITH HBAO+

[Screenshots: two benchmark captures]

 

Yes, HBAO+ is taxing, but in general not enough to make a big enough impact to blame it alone. The benchmark differences with and without it correlate with what I've seen when testing HBAO+ on/off in other games such as TW3, RotTR, FC4 and FC Primal.


26 minutes ago, MageTank said:

What? Ubisoft is paving the way! Their games are so hardcore, you need hardware that doesn't yet exist to properly enjoy them. Besides, Nvidia Gameworks is a staple. How else will I be able to enjoy excessive beard physics?

 

No, but seriously. People need to take a page out of Naughty Dog's book. Uncharted 4 looks way too good to be a PS4 game, and it puts a lot of these PC games to shame. Imagine if we had a game tailored to PC hardware the way they did with the PS4. I know it's easier to do with the PS4, since it's a closed ecosystem when it comes to hardware (no Intel vs AMD, no AMD vs Nvidia, etc.) and there are no varying degrees of detail to account for, but it would be absolutely amazing to see what could be done if a publisher gave their devs proper resources and time to pull off a masterpiece. I just hope I live long enough to see that happen.

Case in point: here are screenshots I took of Uncharted 4 on my PS4.

http://imgur.com/a/sbkTB


I suspect the 3GB version was originally supposed to be the 1060 and come out first, but Nvidia decided to react to AMD's 480 and release the 6GB version first.

 

I think the 6GB was only supposed to come out a little later this year as the 1060 Ti, but Nvidia knew that the 480 would be far better value than the 3GB 1060 and rushed it out.

 

Either way, the 3GB 1060 is a nothing card; you may as well get the 470 for around the same money.

----Ryzen R9 5900X----X570 Aorus elite----Vetroo V5----240GB Kingston HyperX 3k----Samsung 250GB EVO840----512GB Kingston Nvme----3TB Seagate----4TB Western Digital Green----8TB Seagate----32GB Patriot Viper 4 3200Mhz CL 16 ----Power Color Red dragon 5700XT----Fractal Design R4 Black Pearl ----Corsair RM850w----


1 hour ago, Trik'Stari said:

At 1920x1080? Or 5760x1080? Because I never hit that much on single screen with my 970. I can max literally everything. If I turn on triple monitor surround, then I can easily go well above 3.5/4 gigs of VRAM

 

Or do you have DSR turned on or something?

1920x1080


7 hours ago, MageTank said:

My brother hits about 3.2-3.3gb at 1920 x 1080 with everything on max. This is what's seen in Afterburner OSD, not in-game GTA options. Though, he has 3 monitors (only one showing the game). If the game is played in fullscreen exclusive on just one monitor, will the others being used by the same GPU for Skype/Teamspeak monitoring make the overall VRAM usage increase? I have personally never used a multi-monitor setup, so I genuinely do not know, lol. 

 

Ding ding ding! Took someone long enough to get it. Architectural changes have an impact on VRAM (both capacity and bandwidth). However, it's not magic; it has its limitations. That being said, using Kepler's potential lack of VRAM as an analogy when talking about Pascal (two generations newer) makes for a difficult comparison. Kepler lacks the level of DCC used in Maxwell, let alone the improved version used in Pascal.

 

Why these people are fighting to such an extent is beyond me. Regardless of how much VRAM you throw on the 1060, it's still a sub-1080p card these days. A 1060 won't be able to max out every title at 1080p even if it had 8GB of VRAM. Some things are just too demanding in terms of raw horsepower, and no amount of VRAM or VRAM bandwidth will save them.

 

As others said, the nomenclature issue is getting old, and there is no reason to call an entirely different card by the same name. Once is a problem, but twice is unforgivable. If this card were an x50 card, nobody in this thread would be complaining. AMD's entry-level gaming cards have 2GB versions as well, so a 3GB entry-level gaming card wouldn't and shouldn't bother anyone. Again, these cards will hit other limitations before VRAM becomes an issue 90% of the time.

Don't know.

 

I tend to turn my other monitors off when playing in single screen, or the game I'm playing doesn't use much VRAM to begin with (MMOs).

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs
