
NVIDIA just made EVERYTHING ELSE obsolete.

Emily Young
4 hours ago, ageekhere said:

In Australia on Gumtree the 2080 Ti price is all over the place, ranging from $700 to $2,000 AUD. People are still trying to sell the 1080 Ti for $600 to $1,000 AUD

Of course they degenerated into "investments" instead of obsolete computer parts... people spent good money on what they were made to think of as a "technological miracle" (instead of standard technological evolution that becomes obsolete fast, as all electronics do and have for decades) and think they should get reimbursed for that investment.

 

That's Nvidia's fault, with its Apple-esque tactics, and also the fault of all the "tech journalists" (present party not excluded) who, whether for profit, out of naivety, or a mix of both, helped promote such tactics instead of criticizing them.


On 9/2/2020 at 6:00 AM, papajo said:

You probably did not understand my initial post because that was exactly my point. 

 

Or in other words, a $300 price drop on a $500 greedy price inflation (= $1,000 total) is as little as possible and in no way a deal.

 

$500 is the maximum a rational human being should spend on a graphics card (and a graphics card at that price should be a flagship). Even if you go to the WC and poop $100 bills, it doesn't mean you need to throw them away blindly; I am pretty sure that even rich people try to justify their expenses.

I have to agree here.

 

Consider that not that long ago, the generational jumps were BIG, and the prices were reasonable.

 

Take the 980 Ti to GTX 1070 transition: the 1070 was as fast as the 980 Ti, the last generation's flagship, for $379. In that era, the 980 Ti's launch price was $649.

 

So by that logic, the 3070 being as fast as or faster than the 2080 Ti is "normal". And the price of $499 for the 3070 is actually rather high for a mid-tier card.

 

So, just because Nvidia:

- A. Didn't make a big leap performance-wise with the 20 series (the 2070 was not as fast as a 1080 Ti), and

- B. Overpriced the 20 series by a LOT,

DOESN'T make this launch a SUPER DUPER DEAL. It just looks good in comparison with the 20 series' complete PRICE GOUGING and downright BAD VALUE.

 

This is more like a return to normal. And even then, $499 for a 3070-series card is more of an increase than inflation alone would explain, compared to $379 for the 1070 just four years ago.

 

I am tempted to get a 3070, but I consider it just OK value, not a DEAL by any means.


Historical initial MSRPs of the top-tier cards are as follows.

Excludes dual-GPU cards and OC'd specials. (Feel free to correct me on anything listed, just provide links as proof... cheers.)

 

Launch Year --- GPU ------------------------ Price --- With Inflation

2000 --- GeForce 2 Ti -------------- $500 ---- $750
2001 --- GeForce 3 Ti500 ----------- $350 ---- $515
2002 --- GeForce 4 Ti4600 ---------- $400 ---- $575
2003 --- GeForce FX 5950 Ultra ----- $500 ---- $705
2004 --- GeForce 6800 Ultra -------- $500 ---- $685
2005 --- GeForce 7900 GTX ---------- $500 ---- $665
2006 --- GeForce 8800 GTX ---------- $600 ---- $770
2008 --- GeForce 9800 GTX+ --------- $230 ---- $275   (no, this isn't a typo)
2009 --- GeForce GTX 285 ----------- $400 ---- $485
2010 --- GeForce GTX 480 ----------- $500 ---- $600
2011 --- GeForce GTX 580 ----------- $500 ---- $575
2012 --- GeForce GTX 680 ----------- $500 ---- $565
2013 --- GeForce GTX 780 Ti -------- $700 ---- $775
2015 --- GeForce GTX 980 Ti -------- $650 ---- $710
2017 --- GeForce GTX 1080 Ti ------- $700 ---- $740
2018 --- GeForce RTX 2080 Ti ------- $1200 --- $1230
2020 --- GeForce RTX 3090 ---------- $1500 --- $1500

 

Spot the outliers.

 

Now, we can argue and debate about whether the 3090 is a Titan or not. If it is, then fine, but we should then accept that a 3080 Ti is to be released in the future, as the 3080 isn't close enough in performance to the 3090 (as per Nvidia's release event graphs) to be the top-tier gaming card. (All previous-gen Titans had a gaming card that was near identical in performance.)

As such, this leaves Nvidia the chance either to do what they have done in the past, which is release the new x80 Ti at the same price as the x80 and drop the x80's price... or do what they seem more likely to do nowadays and set the x80 Ti price somewhere between the 3080 and 3090, likely $900-$1,000.

 

~$700 (in today's currency) is not uncommon, as we can see, but anything above that is unacceptable.

 

Hope this clears up the debate over price.
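For anyone who wants to check the adjustments themselves, the math is just a CPI ratio. A minimal sketch follows; the CPI-U annual averages below are approximate, rounded figures for illustration, not official quotes:

```python
# Convert a historical launch price into 2020 dollars via a CPI ratio.
# CPI-U annual averages here are approximate, rounded values for illustration.
CPI = {2000: 172.2, 2003: 184.0, 2011: 224.9, 2020: 258.8}

def adjust(price, launch_year, target_year=2020):
    """Inflation-adjust: price * (CPI[target] / CPI[launch])."""
    return price * CPI[target_year] / CPI[launch_year]

print(round(adjust(500, 2000)))  # ~751, close to the ~$750 row for 2000
```

With the 2003 entry, `adjust(500, 2003)` lands near the $705 shown for the FX 5950 Ultra, so the table is consistent with a plain CPI adjustment.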

 

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


9 minutes ago, SolarNova said:

Historical initial MSRP of the top tier cards are as follows.

(...)

 

I am not sure these prices are entirely accurate (maybe they refer to price predictions from some websites before launch and not to the actual average market price).

 

 

I was a hardware tinkerer in all those generations and clearly remember the prices being lower than those (from the GTX 480 and back) by at least $100 (without counting inflation), with the 8800 GTX being an exception; but it was heavily criticized for being that expensive (something that doesn't happen nowadays with today's "tech journalists"), which led to lower prices for the 9000 series (the 9800 GX2 being the flagship there, not the GTX, but still costing around $350-400).

 

 

I also think some of the prices listed are lower for the newer generations when you consider Nvidia's Founders Edition graphics cards (which began with the GTX 680, if I am not mistaken).

 

 


Since I'm brushing up on this stuff anyway for obvious reasons...

 

2 hours ago, SolarNova said:

(...)

 

~$700 (in todays currency) are not uncommon as we can see, but above that is unacceptable.

If you're comparing against the RTX 3090, you missed a few:

 

2006 --- GeForce 7950 GX2 --- $600 --- $771

2007 --- GeForce 8800 Ultra --- $830 --- $1,037

2008 --- GeForce 9800 GX2 --- $666 --- $801

2009 --- GeForce GTX 295 ---- $499 --- $603

2011 --- GeForce GTX 590 ---- $699 --- $805

2012 --- GeForce GTX 690 ---- $999 --- $1,127

 

Still high, though not higher than some previous Titans, assuming it is one:

 

2013 --- GeForce GTX Titan (Kepler) --------- $999 ------ $1,111

2014 --- GeForce GTX Titan Black (Kepler) - $999 ------ $1,093

2014 --- GeForce GTX Titan Z (2x Kepler) --- $2,999 --- $3,282 (yes, three Kepler Titans)

2015 --- GeForce GTX Titan X ------------------ $999 ------ $1,092

2016 --- Nvidia Titan X (Pascal) ----------------- $1,200 --- $1,295

2017 --- Nvidia Titan Xp --------------------------- $1,200 --- $1,268

2017 --- Nvidia Titan V ----------------------------- $2,999 --- $3,170

2019 --- Nvidia Titan RTX ------------------------- $2,499 --- $2,533 (released December 2018, using 2019 inflation)

 

So about $200-400 higher than a "typical" Titan release, depending on which era you consider typical.

 

RTX 3080 pricing is on the high end of the scale but not out of line for an 80-series card:

 

image.png

Emily @ LINUS MEDIA GROUP                                  

congratulations on breaking absolutely zero stereotypes - @cs_deathmatch


4 hours ago, papajo said:

don't even make half of the revenue of gaming alone.

 

I didn't speak of revenue, I spoke about the number of chips. When you are buying 30-40k+ chips in bulk for a server, you get a great discount.

 

4 hours ago, papajo said:

Besides that, those are different segments, and we are not discussing prices for, e.g., the Quadro cards.

 

Again, you are missing that it's the same chip. And a lot of professionals choose to use normal GTX/RTX cards due to the price difference. Simple economics: the larger the customer base you have for a single product (chip), the more the chip is worth.

 

Talking about this at work, I was reminded that it was only a few (10?) years ago that you needed a separate streaming card if you were going to stream video. This is yet another example of something built into today's graphics cards that adds value.

 

You are so hung up on YOUR perceived value of the product that you don't care about the market value of the product. Everyone has a different opinion about the value of a product, but that doesn't matter, because the market value is the true value of a product.

 

Publilius Syrus got this right in the 1st century BC: everything is worth what its purchaser will pay for it.

 

So even if you don't like the product's value, it's still what it's worth, because the product sells.


24 minutes ago, Kroon said:

I didn't speak of revenue, I spoke about the number of chips. When you are buying 30-40k+ chips in bulk for a server, you get a great discount.

 

Revenue is all that matters in terms of our discussion.

 

Also, you don't get any discount (at least from Nvidia) for a 40k bulk order; you can tell that from the prices of their HPC units, which perform more or less like a cluster of Quadros that would cost less than the sticker price of those units.

 

  

24 minutes ago, Kroon said:

Again, you are missing that it's the same chip. And a lot of professionals choose to use normal GTX/RTX cards due to the price difference.

 

And you are missing the point that this is a vast minority of people. The people who are small enough (in terms of business capital) to consider buying the GTX versions but large enough to buy the top of the line are a minority of a minority. Most GPUs (RTX/GTX) go to gamers, enthusiasts, and consumers in general, and most sales revenue comes from cheaper GPUs anyway (since most people don't have the top-tier ones).

 

But I think at this point it makes no sense to continue responding to your quotes, since you seem to be an Nvidia fanboy: find a small detail and expand on it, then change the conversation to another detail, anything to keep defending your position. I mean, we started arguing about Nvidia overpricing their gaming GPUs, and you (after various other U-turns) ended up mentioning that if you make a 40k bulk order for a server you will get a discount, as if that is somehow related.

 

 

It's OK, keep paying $1,000+ for a gaming GPU if you love wasting your money that much and if you indeed are one of the less than 1% who buy them. I will keep insisting that more than 1% of people should be able to afford the top-line consumer-grade graphics card, and that it is madness to spend upwards of 50% of your entire PC budget on a single component, namely the graphics card.


26 minutes ago, GabenJr said:

Since I'm brushing up on this stuff anyway for obvious reasons...

 

If you're comparing against the RTX 3090, you missed a few:

 

(...)

That's assuming the 3090 is a Titan, which is fine, as I explained; it just means we need to expect a 3080 Ti, as the 3080 doesn't seem to be in line with previous generations in terms of performance parity with the Titan of its generation.

 

Many of those GPUs you listed were dual-GPU cards or overclocked versions of the card below (hence why I didn't include them), and neither scenario is comparable to the 3090, which is its own GPU.



7 minutes ago, SolarNova said:

Many of those GPUs you listed were dual-GPU cards or overclocked versions of the card below (hence why I didn't include them), and neither scenario is comparable to the 3090, which is its own GPU.

That seems pretty arbitrary if the goal was to list the top-tier cards for each generation.



11 minutes ago, papajo said:

But I think up to this point it makes no sense to continue responding to your quotes

At least something we can agree upon! You obviously do not understand simple business economics and avoid the whole discussion about market value, so there is nothing to discuss.

 

13 minutes ago, papajo said:

you seem to be an Nvidia fanboy

Well, I don't have any Nvidia products at the moment, so I have a hard time figuring that one out. I've been running on integrated graphics from AMD or Intel for some time now. You know why?

 

Because my perceived value of ANY graphics card is lower than the market value, so I do not buy them.


1 hour ago, GabenJr said:

Since I'm brushing up on this stuff anyway for obvious reasons...

 

If you're comparing against the RTX 3090, you missed a few:

 

(...)

Inflation is only an academic indicator (it suits you to use it if your position is to defend an argument such as "Nvidia prices are cheap"). Product prices, not only of graphics cards but in general, more often than not do not adjust for inflation directly. Even if you focus on Nvidia, the proof of that is the list provided above by @SolarNova (which I am going to correct with sources, because some prices are steeper than the real ones, but a few hours after posting this reply, since I don't have the time to do it right now).

 

We see, for example, that a 2003 top-tier product (GeForce FX 5950 Ultra) cost $500 while a 2011 top-tier product (GTX 580) cost $500 as well (we see that in many other cases in the list, but I just picked one), which means Nvidia did NOT adjust its top-tier product price for inflation (and this is common practice).

 

Because $500 in 2003, taking inflation into consideration, would have been $611 in 2011, yet the 2011 GTX 580's sticker price was $500, so no adjustment for inflation was made... it just suits one's needs to mention inflation when defending higher prices, because inflation, well... inflates price tags...

 

Also, besides being academic, it is very hard to calculate how inflation really plays a part in product sticker prices, because there are many other variables (country GDP and per-capita income, prices of raw materials, financial state of the company, etc.), and inflation is currency- and country-specific, so the same adjustment you made above would not apply to the European market, for example.

 

 


1 hour ago, papajo said:

We see, for example, that a 2003 top-tier product (GeForce FX 5950 Ultra) cost $500 while a 2011 top-tier product (GTX 580) cost $500 as well (we see that in many other cases in the list, but I just picked one), which means Nvidia did NOT adjust its top-tier product price for inflation (and this is common practice)

2011's top tier product was the GTX 590 at $599, making the FX 5950 Ultra roughly the same price in 2011 dollars. You can argue that we should still only be paying about that much (~$700 today - So RTX 3080) for top-tier performance instead of expanding the line with Titans, but you can't really call it inconsistent. I don't intend to get bogged down in the argument here, so I'll leave it at that.



2 hours ago, GabenJr said:

2013 --- GeForce GTX Titan (Kepler) --------- $999 ------ $1,111

2014 --- GeForce GTX Titan Black (Kepler) - $999 ------ $1,093

What’s the difference?



1 hour ago, GabenJr said:

That seems pretty arbitrary if the goal was to list the top-tier cards for each generation.

I disagree.

A dual-GPU card is just two cards in one. You're not comparing one card to another; you're comparing two of one to another. The price is of course going to be higher, because you're paying for two GPUs.

 

As for those old overclocked GPUs, they were the same GPU, just overclocked. If the 3090 were a much-higher-clocked 3080, then sure, you could include them, but it's not.

 

As I'm sure you're well aware, when doing a comparison of anything, you try your best to keep things as 'apples to apples' as possible.

 

If we don't put in exclusions, you might as well include every single card made, right up to and including the most powerful Quadro cards, OEM-only cards, and other limited-run options that few people could ever get hold of.

 

Perhaps a better header for the list would be:

"Top-tier single-GPU 'gaming' cards" (thus excluding Titans and dual-GPU cards)

 

You can then decide if you think it's fair to include those odd cards that popped up over the years that were just higher-clocked versions of the card below, like the 8800 Ultra. That card in particular didn't do well, because people knew they could just buy an 8800 GTX, overclock it themselves, and save $300+.

 

Then also decide if the 3090 deserves to be in the list.

Here I would argue that since we can't put it past Nvidia to release a Titan later down the road, and since, based on past Titan-vs-top-gaming-card examples, the 3080 isn't fast enough relative to the 3090 to be considered the real top gaming card versus the so-called Titan (3090), we have one of two situations. Either the 3090 IS a gaming card and NOT a Titan (thus overpriced, and it SHOULD be in the list), OR the 3090 IS a Titan (thus it should NOT be in the list) but Nvidia has a hidden 3080 Ti yet to be released, in which case the 3080 is overpriced and they 'should' (but probably won't) introduce the new 3080 Ti to replace the 3080 at its current MSRP and drop the price of the 3080. (They did this in the 700 series when they released the 780 Ti.)

 

Hope all that makes sense.



52 minutes ago, Drama Lama said:

What’s the difference?

Probably a higher clock speed and better binning.

I could use some help with this!

please, pm me if you would like to contribute to my gpu bios database (includes overclocking bios, stock bios, and upgrades to gpus via modding)

Bios database



7 hours ago, GabenJr said:

Since I'm brushing up on this stuff anyway for obvious reasons...

 

If you're comparing against the RTX 3090, you missed a few:

 

(...)

TY for some level-headed analysis and not pants-sh*tting. People seem to want to not call it a Titan because it wasn't called that officially, even though it has specs, memory, etc. that are in line with Titan cards. And the price has actually been brought down to where a Titan-level enthusiast card should be. Massive leaps above Titan RTX performance (to be verified), a price $1,000 less, and somehow this is met with pants-sh*tting.

 

People have to realize there will (probably) be a 3060, and a 3070, that can give them great performance for a reasonable price, back in line with the trend of previous generations. This should be celebrated. Especially since gaming can be anywhere from 1080p to 4K, there is something for everyone at this point.

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


Mmm, graphically looking at the price of the annual xx80s over time isn't enough, as not all 80s, even if they're labeled as such, are the same performance bump. What's needed is a scatter plot of the 80s with price and performance on the X and Y axes. Whether or not the 3080 series is that good for consumers will depend on how far above or below the trendline it lies. But we don't have the 3080's real performance numbers yet, and it's not easy to compare 'performance' across decades of evolving technology.


@AdmiralKird Way ahead of you!

 

grafik.thumb.png.ee6b5bd3cc08c607247fdf741e8ed40f.png

 

This graph basically shows "bang for the buck", or in other words, raw FP32 GFLOPS per dollar (inflation adjusted). It doesn't consider actual gaming performance, and it doesn't take VRAM, memory bandwidth, or power consumption into account. The orange line shows the expected growth of performance per dollar over time. I chose 33% annually because it's a general rule of thumb for performance increase (around 30%/y) and it fits the data points well. I chose the GTX 480 as a baseline; it was launched about 10 years ago. The abscissa (x-axis) is scaled linearly and shows the release date; the ordinate (y-axis) is scaled logarithmically (exponential growth).

The prices are inflation-adjusted, consistent with @GabenJr's figures.

 

My thoughts:

The 20-series cards were pretty much a disappointment, and the data clearly shows why. The 600-series cards were actually a great value, which also coincides with my memory.

 

 

Data sources:

inflation calculator: https://fxtop.com/en/inflation-calculator.php

FP32 perfomance, launch date, launch MSRP: https://www.techpowerup.com/gpu-specs/geforce-rtx-3080.c3621
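The metric and trendline described above can be sketched in a few lines. The GTX 480 baseline figures here are approximate (~1345 FP32 GFLOPS at ~$600 inflation-adjusted), and the 33%/y rate is just the rule of thumb from the post:

```python
# Sketch of the "GFLOPS per inflation-adjusted dollar" metric and trendline.
# Baseline figures for the GTX 480 are approximate (~1345 GFLOPS, ~$600 adj.).

def gflops_per_dollar(gflops, adjusted_price):
    """Raw FP32 GFLOPS per inflation-adjusted dollar."""
    return gflops / adjusted_price

def trendline(baseline, years, annual_rate=0.33):
    """Expected GFLOPS/$ after `years`, assuming ~33% annual improvement."""
    return baseline * (1 + annual_rate) ** years

baseline = gflops_per_dollar(1345, 600)  # GTX 480 (2010), ~2.24 GFLOPS/$
print(trendline(baseline, 10))           # expectation ten years later, ~39 GFLOPS/$
```

A card plotting above this curve is better value than the historical trend; below it, worse.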


16 hours ago, GabenJr said:

2011's top tier product was the GTX 590 at $599, making the FX 5950 Ultra roughly the same price in 2011 dollars. You can argue that we should still only be paying about that much (~$700 today - So RTX 3080) for top-tier performance instead of expanding the line with Titans, but you can't really call it inconsistent. I don't intend to get bogged down in the argument here, so I'll leave it at that.

That is a dual-GPU graphics card (and even then it was sold for less than the $611 the inflated price would suggest), which did not even outperform the GTX 580 in many games. The FX 5950 Ultra is a single-GPU graphics card; make an apples-to-apples comparison.

 

I am pretty disappointed in this response, because you ignored all the other examples (e.g., the 6800 Ultra compared to the GTX 480) and you ignored the main point of the post: that inflation is not something you can slap onto a price tag to argue it's the same. Do you think you pay prices adjusted for inflation for tomatoes or TVs or cars?

And you mentioned an example that doesn't really apply (GTX 590)...

 

which suggests to me that you are biased, hence I am disappointed.

 

Inflation is an academic indicator, and it is country- and currency-specific. Only someone who wants to misinform and defend a position no matter what would use such an argument, at least if they didn't know that; but in that case I made you aware of it, yet you still insist.

 

 

A top-of-the-line 166 MHz Pentium in early 1993 would cost you $620, while a top-of-the-line 500 MHz Pentium III would cost you $650. Inflation would make that $620 of 1993 into $714 in 1999, yet it was $64 cheaper than that (and a few months later the top Pentium III price was discounted to about $350-400).

 

Inflation is not an argument; it's only a "look, I can make the price seem bigger, and this suits my argument". Nothing in the real world adjusts directly for inflation; it is an academic indicator, not a rule or law.

 

 

There are many factors that change/affect the price tag of a product much more severely than inflation, e.g., supply and demand.

 

In 2001, Nvidia had far fewer orders, and graphics card distribution logistics were more primitive and expensive compared to 2011 or today, which means cards should be cheaper now even taking inflation into account.


41 minutes ago, HenrySalayne said:

My thoughts:

The 20 series cards were pretty much a disappointment and the data clearly shows why. The 6 series cards were actually a great value which also coincides with my memory.

While in the real world it was the other way around: people praised the GTX 200 series and were disappointed with the GTX 600 series.

 

which is one example of why manipulating numbers to showcase something is called "cooking the books" or "creative accounting".

 

Why should GFLOPS/$ be a serious indicator? As for inflation, I already mentioned that it is mostly an academic indicator.

 

 

Obviously the GFLOPS should be better each year, especially since we are talking about computer electronics, where one of the defining characteristics is that each new generation has vastly better performance, on many occasions more than double that of the previous generation.

 

So checking whether you pay about the same for the extra performance is futile; a huge performance bump should be (and is) a given in PC electronics. Prices, on the other hand, should (and usually do) stay the same, if not get cheaper, as more people adopt said products.

 

To put it in a somewhat different context: the current Pentium CPU is something like a million times faster than the Pentium 1 back in 1993. This doesn't mean Intel's prices are cheap just because its products today do not cost $1,000,000 (to reflect the performance difference compared to the past).

 

Let's not forget that this extra performance is FREE performance, just intellectual property: a pattern on a silicon surface changes. They don't use more gold or more manpower or whatever to achieve it; they just tweak the design.


Only skimmed this thread. The argument doesn't seem very useful. We have to buy in the "now"; it doesn't matter what we had in the distant past. What gets you the best system for your needs? The 3080 seems great for a high-end gaming build, and the 3070 isn't bad either for a tighter budget. They're not low-end options by any means, but you can still use the old generation to fill that area. Until next Navi comes out, there isn't really any other choice.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 minute ago, papajo said:

While in the real world it was the other way around: people praised the GTX 200 series and were disappointed with the GTX 600 series.

 

which is one example of why manipulating numbers to showcase something is called "cooking the books" or "creative accounting".

 

Why should GFLOPS/$ be a serious indicator? As for inflation, I already mentioned that it is mostly an academic indicator.

 

 

Obviously the GFLOPS should be better each year, especially since we are talking about computer electronics, where one of the defining characteristics is that each new generation brings vastly better performance, on many occasions more than double that of the previous generation.

 

So checking whether you pay about the same for the extra performance is futile; a huge performance bump should be (and is) a given in PC electronics. Prices, on the other hand, should (and usually do) stay the same, if not get cheaper.

I seriously don't have a response to that. It is your choice to maintain your opinion and ignore any objective perspective.


18 hours ago, papajo said:

I will keep on insisting that more than 1% of the people should be able to afford the top line consumer grade graphics card

Why tho. There's no reason for this, other than "It would be really nice."

Having the "top line GPU" isn't nearly as important as something like "Internet access" or "mobile phone access".

You could make the argument that the latter two are so important and intrinsic to daily life in a 1st-world country that we *should* insist that people be able to afford them.

 

Don't get me wrong, I love games and I think a lot of people can derive a great amount of joy from them.

BUT

People certainly aren't required to own a brand-new flagship to enjoy them.

 

As for

18 hours ago, papajo said:

it is madness to spend upwards of 50% of your entire PC budget on a single component, namely the graphics card.

I don't think anyone is recommending you do that, lol.

If you're putting a 3090 in a system where the entire rest of the system only cost you $1400, you're probably looking at a mismatched system. I would posit that 30-40% would be a better percentage to work with.


1 hour ago, HenrySalayne said:

I seriously don't have a response to that. It is your choice to maintain your opinion and ignoring any objective perspective.

It's not my opinion; these are facts.

 

 

You just chose an arbitrary increasing indicator and divided it per dollar... why should this yield any sort of productive inference?

 

And if, according to you, the more a tech indicator increases, the more the price should increase (or, if it's increasing linearly, then everything is OK),

 

then, for example (following your logic/narrative): a Pentium 1 (P5) @ 166 MHz, which cost $620 back in its day, managed about 30 MFLOPS, while a Core i9-10900K delivers 706 GFLOPS, in other words about 24,000 times faster than the Pentium 1 in terms of FLOPS. So I repeat: according to your logic/narrative, as long as the price is up to 24,000 times the price of the P1, the price would be a fair one.

 

So let's sell those 10900K CPUs at 20,000 × $620 (I'm not even accounting for inflation, see how generous I am?) = $12,400,000. It's a bargain price, isn't it?

 

 

24,000 times the performance but only 20,000 times the price, WOW, and inflation isn't even accounted for, so it's even cheaper!!! :D Great prices; I wonder why Intel doesn't sell them at this price.
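The arithmetic in that reductio checks out; a quick sketch, taking the quoted FLOPS figures at face value:

```python
# Quoted figures from the post: Pentium 1 @ 166 MHz ~30 MFLOPS,
# Core i9-10900K ~706 GFLOPS.
pentium_flops = 30e6
i9_flops = 706e9

speedup = i9_flops / pentium_flops         # the "about 24,000 times" claim
hypothetical_price = 20_000 * 620          # the post's generous 20,000x multiplier

print(round(speedup), hypothetical_price)  # 23533 12400000
```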

 

 

Not to mention that more demand, bigger supply pipelines, etc. should make it even cheaper.

 

 

But you can keep on trying to use every trick to inflate prices (literally or figuratively) if you want it to be so.

 

 


1 hour ago, papajo said:

It's not my opinion; these are facts.

No. If you can't supply any objective evidence, it is just your opinion.

 

1 hour ago, papajo said:

You just chose an arbitrary increasing indicator and divided it per dollar... why should this yield any sort of productive inference?

As I mentioned already:

2 hours ago, HenrySalayne said:

It doesn't consider actual gaming performance and it doesn't take VRAM, memory bandwidth or power consumption into account.

The raw FP32 performance is an objective and measurable value across many generations of graphics cards. It's not real world performance, but it's a good estimate.
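To make the metric concrete, here is a minimal sketch of the GFLOPS-per-dollar calculation being discussed (the launch MSRPs and peak FP32 figures below are approximate spec-sheet values I am assuming for illustration, not data taken from the graph):

```python
# Approximate launch MSRPs and peak FP32 throughput (illustrative assumptions).
cards = {
    "GTX 1080 Ti": {"usd": 699, "gflops": 11_340},
    "RTX 2080 Ti": {"usd": 999, "gflops": 13_450},
    "RTX 3080":    {"usd": 699, "gflops": 29_770},
}

# FP32-per-dollar: higher means more raw compute for the money.
for name, c in cards.items():
    print(f"{name}: {c['gflops'] / c['usd']:.1f} GFLOPS per dollar")
```

On these assumed numbers the 20-series dip and the 30-series rebound both show up clearly, which is the pattern the graph was arguing.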

1 hour ago, papajo said:

And if, according to you, the more a tech indicator increases, the more the price should increase (or, if it's increasing linearly, then everything is OK),

Where and when did I say that? I don't want to be condescending, but did you understand the graph I've provided or read the attached text?

1 hour ago, papajo said:

Not to mention that more demand, bigger supply pipelines, etc. should make it even cheaper.

No, it doesn't. There is a lower limit on the price per unit. I'm pretty sure there is virtually no difference in the per-unit production cost between 100,000 and 500,000 graphics cards.

1 hour ago, papajo said:

But you can keep on trying to use every trick to inflate prices (literally or figuratively) if you want it to be so.

My graph clearly shows that your claims are just wild speculation. Despite the 20 series of graphics cards, prices (or, to be more specific, prices for FP32 performance) are pretty much where they should be.

