You thought GTX 1060 was confusing? You've seen nothing. Enter RTX 2060 and 6 variants of the same card

Bouzoo

Projecting forward a bit, but with Nvidia now having their "main" stack all with Async Compute, Nvidia has a lot of interest in pushing Game Devs to implement it. What that means is that by 2020, when Nvidia's 7nm GPUs go on sale, there'll be a significant gap between Pascal & Turing, which makes the jump to 7nm far more useful for those with high-end Pascal GPUs.

10 minutes ago, mr moose said:

??  that makes no sense.

Of course I'm comparing two different eras, why do you think I said it was so much better these days?

 

Not too sure what kind of world you lived in, but getting the best when you only had 4 options was not an extra $40; that is what it is like today. Back then you had to double your budget, because it went from shit average at $150 to the best at $300+, and there were no 5 iterations in between like there are today. The closest we came was when Voodoo was still in the game, because they had a few options in between Nvidia and some S3 variants.

If 3DFX hadn't been so slow at updating their designs (and hadn't decided that they'd be the only ones manufacturing their cards), we'd probably still be seeing situations where a graphics card was able to hold its own against multiple generations from Nvidia.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

4 minutes ago, Dabombinable said:

If 3DFX hadn't been so slow at updating their designs (and hadn't decided that they'd be the only ones manufacturing their cards), we'd probably still be seeing situations where a graphics card was able to hold its own against multiple generations from Nvidia.

3Dfx's "We want to be the Intel of GPUs" move was one of the top 10 most boneheaded moves in semiconductor history.

18 minutes ago, mr moose said:

??  that makes no sense.

Of course I'm comparing two different eras, why do you think I said it was so much better these days?

 

Not too sure what kind of world you lived in, but getting the best when you only had 4 options was not an extra $40; that is what it is like today. Back then you had to double your budget, because it went from shit average at $150 to the best at $300+, and there were no 5 iterations in between like there are today. The closest we came was when Voodoo was still in the game, because they had a few options in between Nvidia and some S3 variants.

I'm not talking about that distant past; I'm already talking about the GeForce/Radeon era.

31 minutes ago, Taf the Ghost said:

Projecting forward a bit, but with Nvidia now having their "main" stack all with Async Compute, Nvidia has a lot of interest in pushing Game Devs to implement it. What that means is that by 2020, when Nvidia's 7nm GPUs go on sale, there'll be a significant gap between Pascal & Turing, which makes the jump to 7nm far more useful for those with high-end Pascal GPUs.

I think by that time, the performance improvement in a wider range of workloads will be more significant.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1

7 minutes ago, D13H4RD said:

I think by that time, the performance improvement in a wider range of workloads will be more significant.

To an extent, though the issue is that the DX11 era has hung on a lot longer than originally expected. Nvidia's DX12 support was a disaster fire until right about when RX Vega launched. This is why the places you'll see the 2060 beat the 1070 Ti are the titles that already leverage Vulkan/async shaders.

15 minutes ago, Taf the Ghost said:

To an extent, though the issue is that the DX11 era has hung on a lot longer than originally expected. Nvidia's DX12 support was a disaster fire until right about when RX Vega launched. This is why the places you'll see the 2060 beat the 1070 Ti are the titles that already leverage Vulkan/async shaders.

I think the exponential curve is being hit head on right now too: each leap requires more, and there is a disparity between that required leap and the advancement of GPU technology, or the cost to make it happen. Because of that large resource requirement, and the shift to industry-common game engines, fewer game development studios are making that generational-leap effort and instead stick with what they can do best now.

 

Even big powerhouses who traditionally burn through a game engine with every game, like Square Enix, are no longer willing to do that. It didn't help either that Luminous Engine was a technical nightmare to develop. Early demos of it had me very interested; now Square Enix is just another Unreal Engine user.

2 hours ago, RejZoR said:

Your logic is straight up bizarre. I'm surprised you're not also complaining that your hatchback only has 150 HP when it clearly should have had 700 HP for 15,000 bucks... And you're comparing the era of the greatest visual evolution in games (meaning everything got obsolete very quickly) to today's era of stagnation (where a 5-generations-old high-end card still runs things relatively well). Of course anything runs anything smoothly when graphics haven't really moved anywhere for years lol...

 

If you can't step up, I don't know, 40 bucks for a significant performance upgrade on a 300€ card, then what are you even doing with your life? Granulating models down to every 10 bucks is the madness that caused Gigabyte to have freaking 50 versions of the upcoming RTX 2060... It's not their fault or NVIDIA's fault, it's you ppl demanding this nonsense. Uh oh, I can't pay 40 extra bucks for a 300€ purchase that will last me several years, plz cater to me with yet another model that will cost exactly those 10 bucks more instead of 40. Yeah, you ppl created this, not the vendors.

I'm pretty sure the vendors create the cards, and the variations of the cards. Companies creating a product that is more likely to sell is generally how it goes too: if people ask, people buy. Performance or advancement has little to do with how companies are directed in this regard; it's a business to make money. If performance gets them to sell more, then that's what they'll do, otherwise why bother? Making the same money with less investment is still better.

Or you could just pay up a bit to step up to a faster model. If ppl had no options, they would (that's how it worked when the entire lineup of one generation had 5 models). Instead they want a version for every 20 bucks heh... ridiculous.

2 hours ago, RorzNZ said:

I'm pretty sure the vendors create the cards, and the variations of the cards. Companies creating a product that is more likely to sell is generally how it goes too: if people ask, people buy. Performance or advancement has little to do with how companies are directed in this regard; it's a business to make money. If performance gets them to sell more, then that's what they'll do, otherwise why bother? Making the same money with less investment is still better.

Nvidia, and AMD, only allow authorized graphics card designs and place other restrictions, like requiring that a certain ratio of cards be sold. That can mean that even if, for example, the GTX 1070 is becoming very popular and increasing its production would make the most sense (and decrease its price), AIBs are required to sell twice as many 1060s or some other amount of 1080s.

 

What AIBs do really is not organic or directly influenced by customers and their buying habits.

50 minutes ago, leadeater said:

I think the exponential curve is being hit head on right now too: each leap requires more, and there is a disparity between that required leap and the advancement of GPU technology, or the cost to make it happen. Because of that large resource requirement, and the shift to industry-common game engines, fewer game development studios are making that generational-leap effort and instead stick with what they can do best now.

 

Even big powerhouses who traditionally burn through a game engine with every game, like Square Enix, are no longer willing to do that. It didn't help either that Luminous Engine was a technical nightmare to develop. Early demos of it had me very interested; now Square Enix is just another Unreal Engine user.

ML/DL/AI and various other technical eras happened. I don't think Game Engine difficulty has increased as much as the quality of Game Engine Designers has dropped, rapidly, over the last 20 years. If you're top flight at that type of work, you'll make 3x as much in another area of GPU development. There's a reason the main id games are still done by a staff of 40 (with some outsourced art resource help). They've sat on such a pile of cash that they can hire only who they want; as a result, people leave very slowly. (And everyone makes bank.)

 

You see this problem in a lot of fields that take hard direction shifts. Got a friend that's a Math guy. (Plus a bunch of other degrees in subjects I forget. You can understand the type.) He explained to me one time the disaster area that Research Math has been in for nearly 20 years due to everyone wanting "Quants" for Wall Street. Why spend 5-7 years getting a PhD when Wall Street was calling with $250k+ starting jobs for a Masters? Ever the logical ones, Wall Street was able to pretty much pillage an entire generation of Math Research into what we now call "High Frequency Trading".

 

GPU development has undergone a similar trend in the last decade. Even Nvidia, AMD and Intel have clearly shifted their best developers away from the Gaming side of the equation. There's a reason all of the Compute development has been happening.

17 minutes ago, leadeater said:

Nvidia, and AMD, only allow authorized graphics card designs and place other restrictions, like requiring that a certain ratio of cards be sold. That can mean that even if, for example, the GTX 1070 is becoming very popular and increasing its production would make the most sense (and decrease its price), AIBs are required to sell twice as many 1060s or some other amount of 1080s.

 

What AIBs do really is not organic or directly influenced by customers and their buying habits.

Those ratios would be heavily dependent on production yields and model demand. Nvidia: "No, you can't have any more 1080s, because we have a tonne of failed chips here that we have to laser off and turn into 1070s."

 

According to Wikipedia, even some of the 6GB 1060s used the GP104 chips. Don't know how or why, or if that's even true, but if so it would link the shortage of 1080s at the start of the year to the slight oversupply of 1060s in the channel toward the end.
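To make the yields point concrete, here's a back-of-the-envelope sketch using the classic dies-per-wafer approximation and a Poisson yield model. All the numbers (wafer size, die area, defect density) are made up for illustration, not actual GP104 figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic approximation: wafer area / die area, minus edge losses."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

def zero_defect_fraction(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: probability a given die has no defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Hypothetical inputs: 300 mm wafer, ~314 mm^2 die, 0.002 defects/mm^2.
total = dies_per_wafer(300, 314)
good = zero_defect_fraction(314, 0.002)

print(f"candidate dies per wafer: {total}")
print(f"fully working (full-fat SKU): {total * good:.0f}")
# A die with a defect in a redundant block can often be lasered down
# to a cut-down SKU (the "1070 from a failed 1080" case) instead of scrapped.
print(f"defective, candidates for harvesting: {total * (1 - good):.0f}")
```

With these made-up numbers roughly half the dies come out imperfect, which is exactly the pool that "laser it and sell it as a 1070" is meant to monetize, and why the sellable mix of SKUs is tied to yields.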

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

6 minutes ago, mr moose said:

According to Wikipedia, even some of the 6GB 1060s used the GP104 chips.

That's extremely recent; I'm not sure it has much to do with anything other than Turing coming in, making those GP104 dies worth less on the market, and Nvidia having enough of them to do it.

 

7 minutes ago, mr moose said:

Those ratios would be heavily dependent on production yields and model demand. Nvidia: "No, you can't have any more 1080s, because we have a tonne of failed chips here that we have to laser off and turn into 1070s."

That, and how much Nvidia or AMD would like to produce; they will naturally want to make more of the dies that earn them the most profit rather than the ones that earn less.

 

You can see that there is much more profit in the lesser dies, and much more volume, because they each have their own die rather than, as with the x70 and up, sharing a die with different enabled core configurations. Bad GP106 dies are not turned into GP107s, i.e. the GTX 1050; they are simply bad, and the silicon is recycled.

 

Also, the vast majority of GP104 dies used in GTX 1070 cards were destined for that use, rather than being the product of binning or yield-recovery efforts.

 

12 minutes ago, mr moose said:

but if so it would link the shortage of 1080s at the start of the year to the slight oversupply of 1060s in the channel toward the end.

I remember hearing that was partly caused by 1080s being popular enough that AIBs had to, by contractual requirement, order more GP106 dies than they actually wanted. AIBs want to sell more high-end cards because they get a better return on them, and Nvidia wants to sell more mid-range dies because those make the most return for it. Both equally love the big-ticket GPUs though, so it's not like Nvidia really has a specific preference beyond wanting to sell as much of everything as it can. Managing cost is important though, so it isn't going to produce more GP102s than would actually sell, or enough that the price drops below the target price.

44 minutes ago, Taf the Ghost said:

I don't think Game Engine difficulty has increased as much as the quality of Game Engine Designers has dropped, rapidly, over the last 20 years. If you're top flight at that type of work, you'll make 3x as much in another area of GPU development.

Well, that is the typical story of a lack of competition: if there is no longer competition in the game engine sector, or game marketing specifically tied to it, progress slows. Crysis was a big deal because of the game engine, the graphics and the technical brag factor, and we rarely see that now.

 

Anyone can make a wonderful meal with the same access to the same ingredients; some can do a better job than others. Others might set sail over great seas to find a new land and new ingredients, bring them back, and show them off, with a side expedition of taking over that land and enslaving the population, a.k.a. the British.

 

I can respect a terrible game that pioneers something new, changes the industry or sets a new bar.

14 minutes ago, leadeater said:

That's extremely recent; I'm not sure it has much to do with anything other than Turing coming in, making those GP104 dies worth less on the market, and Nvidia having enough of them to do it.


Well yeah, yesterday I watched some video talking about Gigabyte releasing some new models onto the market, 1060s, indeed based on the same chip as the 1080... now that's weird, right?

22 minutes ago, leadeater said:

Well, that is the typical story of a lack of competition: if there is no longer competition in the game engine sector, or game marketing specifically tied to it, progress slows. Crysis was a big deal because of the game engine, the graphics and the technical brag factor, and we rarely see that now.

 

Anyone can make a wonderful meal with the same access to the same ingredients; some can do a better job than others. Others might set sail over great seas to find a new land and new ingredients, bring them back, and show them off, with a side expedition of taking over that land and enslaving the population, a.k.a. the British.

 

I can respect a terrible game that pioneers something new, changes the industry or sets a new bar.

The day WoW went live was the day Game Engine Development was put on death watch. Gaming, while always a business, was no longer under the sway of very small development teams. That meant Corporate Thinking, which meant the money for the really smart programmers went elsewhere. Now, Game Designers really have gotten a lot better in the last 20 years (that's where the money is at), but Engine Development has become commoditized. As a result, the truly top talent has left the space.

 

id Tech 6 is amazing, but that's mostly id off doing their own thing. The only Engine work that's made bank since WoW? Minecraft. Unreal makes Epic good revenue, but the huge money has always been in having the platform.

5 minutes ago, Taf the Ghost said:

Minecraft

OK, that is the one game for which I will break my rule; I do not respect that game even though it changed the industry. Minecraft?

4 minutes ago, leadeater said:

OK, that is the one game for which I will break my rule; I do not respect that game even though it changed the industry. Minecraft?

Here's a harsh reality that won't be accepted until most reading this are extremely old: games like Minecraft, MMOs or MMO-lites will be viewed as some of the most destructive products of the current age. It's going to be looked upon the way asbestos is now.

 

A human only has a certain amount of "creative energy potential" after each sleep cycle, and something like Minecraft can easily burn all of it for years on end. They'll be looked upon as grinding human potential into dust and made into a point of shame, if not outright banned by several countries in a few decades. I wish both the effect and the likely future were jokes, but they most definitely are not.

 

We're already seeing, among the highly connected, a near ban on screen-based devices around their children. Silicon Valley will keep pushing devices everywhere while, simultaneously, keeping them away from their own children.

53 minutes ago, Taf the Ghost said:

Here's a harsh reality that won't be accepted until most reading this are extremely old: games like Minecraft, MMOs or MMO-lites will be viewed as some of the most destructive products of the current age. It's going to be looked upon the way asbestos is now.

 

A human only has a certain amount of "creative energy potential" after each sleep cycle, and something like Minecraft can easily burn all of it for years on end. They'll be looked upon as grinding human potential into dust and made into a point of shame, if not outright banned by several countries in a few decades. I wish both the effect and the likely future were jokes, but they most definitely are not.

 

We're already seeing, among the highly connected, a near ban on screen-based devices around their children. Silicon Valley will keep pushing devices everywhere while, simultaneously, keeping them away from their own children.

What a pessimistic outlook. So long as the games they play are mentally challenging and people engage in a generally healthy lifestyle (including mental health and activity), there should be no degradation.

 

https://www.psychologytoday.com/au/blog/iq-boot-camp/201407/10-ways-improve-your-brain-health

https://www.health.harvard.edu/blog/regular-exercise-changes-brain-improve-memory-thinking-skills-201404097110

3 minutes ago, mr moose said:

What a pessimistic outlook. So long as the games they play are mentally challenging and people engage in a generally healthy lifestyle (including mental health and activity), there should be no degradation.

 

https://www.psychologytoday.com/au/blog/iq-boot-camp/201407/10-ways-improve-your-brain-health

https://www.health.harvard.edu/blog/regular-exercise-changes-brain-improve-memory-thinking-skills-201404097110

It's not pessimistic. It's an observation of reality, of what is already underway, and of the nature of how humans operate. We'll pretty quickly enter a time of "for thee, not for me" among the different sections of society. We've already seen this with the upper 10% of the US population and sugar. It's part of the same reason for the anti-porn swing from factions that are otherwise anything but prudes. Net effects and cascade consequences.

8 minutes ago, Taf the Ghost said:

It's not pessimistic. It's an observation of reality, of what is already underway, and of the nature of how humans operate. We'll pretty quickly enter a time of "for thee, not for me" among the different sections of society. We've already seen this with the upper 10% of the US population and sugar. It's part of the same reason for the anti-porn swing from factions that are otherwise anything but prudes. Net effects and cascade consequences.

I think you might be putting a little too much effort into this. 

On 12/27/2018 at 5:02 PM, cj09beira said:

most people don't put that much thought into it, all they do is grab whichever they can afford

Not when stores only stock the most popular variant, and when the price difference between cards is so marginal that people might pay an extra $10 for one that performs 10-15 fps better. At the beginning, sure, for regular people.
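As a rough illustration of that step-up math (hypothetical prices and frame rates, not real benchmarks):

```python
# Two variants of the same hypothetical GPU, $10 apart, ~12 fps apart.
variants = {"base": (349.0, 95.0), "factory OC": (359.0, 107.0)}

for name, (price, fps) in variants.items():
    print(f"{name}: ${price / fps:.2f} per average fps")

delta_price = 359.0 - 349.0
delta_fps = 107.0 - 95.0
# The marginal $10 buys fps far cheaper than the card's overall rate,
# which is why shoppers hunt for the step-up variant on the shelf.
print(f"marginal cost of the step-up: ${delta_price / delta_fps:.2f} per fps")
```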

On 12/26/2018 at 1:31 AM, Bouzoo said:

3 versions of RTX 2060 cards, with 3, 4 and 6GB of VRAM. Even better, each one of those will have GDDR5 and GDDR6 variants.

I believe this is a trick: if they won't tell you what's inside, GDDR5 or GDDR6, then it's them trying to scam you. Most retailers won't advertise that; that's why I think it's a scam. Pay GDDR6 prices for a GDDR5 unit.
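For what it's worth, the memory type is worth advertising because it changes peak bandwidth dramatically. A quick sketch, assuming typical per-pin data rates for each memory type and a 192-bit bus (illustrative assumptions, not confirmed RTX 2060 specs):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Typical per-pin rates of the era: GDDR5 ~8 Gbps, GDDR6 ~14 Gbps.
print(f"192-bit GDDR5 @ 8 Gbps:  {peak_bandwidth_gbs(192, 8.0):.0f} GB/s")   # 192 GB/s
print(f"192-bit GDDR6 @ 14 Gbps: {peak_bandwidth_gbs(192, 14.0):.0f} GB/s")  # 336 GB/s
```

Nearly double the bandwidth on the same bus width, so which chips are under the heatsink is not fine print.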

Quote

You thought GTX 1060 was confusing? You've seen nothing. Enter RTX 2060 and 6 variants of the same card

 

I can't wait until the RTX 2060 and GTX 1160 are the same card but one doesn't have the additional circuitry for RT, DLSS, and other stuff. 

5 hours ago, Canada EH said:

I believe this is a trick: if they won't tell you what's inside, GDDR5 or GDDR6, then it's them trying to scam you. Most retailers won't advertise that; that's why I think it's a scam. Pay GDDR6 prices for a GDDR5 unit.

They have to advertise it. AFAIK, according to EU laws at least, it has to say on the box exactly what you're buying.

The ability to google properly is a skill of its own. 
