
Rumored Price of RTX 4070 starting at 750 USD

starsmine
2 hours ago, SolarNova said:

Entitlement and having principles are two entirely different things.

 

Someone who is entitled tends to act that way across their entire life: being selfish and unappreciative, and blaming others for their own issues.

 

Being principled, however, is to have a set of morals or codes to ensure things stick to a standard and remain ethical - for a 'thing' to stick to an idea or rule of how it works, etc.

 

Being principled doesn't mean being entitled, and equally, not being principled doesn't mean being entitled. They are entirely different.

 

Just because a person expects, and wants to ensure, that a thing stays the same as what has been proven to work fine in the past, instead of becoming worse for one party and better for another, isn't entitled; it's principled. We have over two decades of GPU pricing structure to show that the past few years are anomalous, benefiting the companies to the detriment of the 'average' consumer.

 

I agree with your distinction between being principled and being entitled, but I'd agree with Dizmo that "principled" still isn't the right word in this situation. It has some connotation of inherent or objective right or wrong. And I assume you do believe it's objectively wrong, but I don't think principles, much like people who try to bring morality into such matters, have any place in the discussion. It's business. It's markets, and it's really that simple.

 

I don't have to always like it. And that's not to say that business can't overlap with morals, if we're talking about something a little more life and death, like medicines etc.. 

 

But I don't think principles and morals are really at play when buying a video card. Actually, as I type that, I kind of take it back. You can have your personal principles, obviously... but you can't say someone else has no principles if they disagree. They just don't have the same principles as you in this situation, which isn't objectively better or worse. But to say "If you buy that, you have no principles" seems overly judgmental and insulting.


Wow, a third-tier card at what should be flagship pricing. Definitely skipping this one.

Hope Nvidia bleeds enough from this joke of a generation's price gouging and fixes the value for the 50 series.

The best gaming PC is the PC you like to game on, how you like to game on it


56 minutes ago, Holmes108 said:

but you can't say someone else has no principles if they disagree. They just don't have the same principles as you in this situation, which isn't objectively better or worse. But to say "If you buy that, you have no principles" seems overly judgmental and insulting.

Ok fair point.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


59 minutes ago, SolarNova said:

Ok fair point.

 

All that being said, I think the pricing has been absolutely atrocious lol.


1 hour ago, SolarNova said:

We have over two decades of GPU pricing structure to show that the past few years are anomalous, benefiting the companies to the detriment of the 'average' consumer.

I think we should not discuss this general topic in this thread - but here I am doing it anyway... ¯\_(ツ)_/¯

 

With 2160p+ resolutions, HDR, ray tracing, and high-refresh displays, a lot came together within a rather short time. DLSS is a symptom of this development, because traditional rendering could not keep up. Manufacturing costs of a 4090 are so high that it's impossible to sell these cards at a price similar to the 1080 Ti's. Some people need to manage their expectations. Jensen wasn't wrong when he said that the time of cheap GPUs is over.

That doesn't mean Nvidia is left with almost no margin and GPUs need to be this expensive. They still have a very healthy margin, and they will milk their position on top as much as they can. Even with the low demand of the current market, the pressure from AMD is quite low. Several people throughout this thread asked whether they should get a 7900 XT or a 4070 Ti, and nobody said "you must get the 7900 XT" - maybe we should.

IMHO, without external pressure Nvidia will not give up their margin, even with volumes as low as they are. AMD is probably quite happy with its position, so it's unlikely that this is going to change in the near future.

 


30 minutes ago, HenrySalayne said:

They still have a very healthy margin. And they will milk the fact that they are on top as much as they can.

One thing to consider is that most people still play at 1080p 60-144 Hz, and at that resolution even the RX 480 and GTX 1060 (6 GB) are still holding on, only now becoming minimum requirements for some of the most demanding games.

So when people upgrade their GPUs less often, the companies need those higher margins to compensate.


On 3/16/2023 at 3:39 PM, Zodiark1593 said:

Nvidia has pretty much moved to “No Mercy” mode. 

You had better THANK Nvidia for still making GPUs!!!

No, seriously. GPT hardware *is* Nvidia. They could, right now, stop all GPU fabrication, pour all R&D into AI acceleration, and bank more than ever.

Microsoft, Google, Amazon, all the big players are looking to deliver dump-trucks of money to Nvidia.


On 3/17/2023 at 9:15 AM, llanelwy said:

I have been waiting for the 4070 to order a new graphics card, but this price is just silly. In the UK, 4070 Tis are about £860; I can't find any near the MSRP of £799.

 

However, I can easily find a 7900 XT for £799.

 

I play at 1440p, and it would be nice to have a card that can do ray tracing, so I was leaning towards Nvidia. But now I'm wondering if I should go AMD, as Nvidia cards here are taking the michael. Anyone got any real-world experience with how ray tracing is at 1440p on these two cards, to help me decide?

How much do you want ray tracing? There are few games that properly take advantage of it right now.

 

Unless you're into a game that uses it well, like Cyberpunk, I'd definitely go team red this gen. And I say that as someone currently running a 3080 Ti, my third Nvidia card in a row, so it's not like I'm a rabid AMD fanboy. Just give the reference cooler a wide berth.



On 3/18/2023 at 3:16 AM, StDragon said:

You had better THANK Nvidia for still making GPUs!!!

No, seriously. GPT hardware *is* Nvidia. They could, right now, stop all GPU fabrication, pour all R&D into AI acceleration, and bank more than ever.

Microsoft, Google, Amazon, all the big players are looking to deliver dump-trucks of money to Nvidia.

Gaming cards are still a major profit center - literally billions, roughly the same as data center profits. No shareholder is going to be pleased by a company throwing that away, especially as the AI money could be gone in a few years too: not in the direct sense, but ASICs might take over the market.

 

AI means Nvidia is lucking out again, but that doesn't mean they can just neglect a huge and stable market. We've all seen how quickly things can turn around when you're into the hip thing of the moment.

 

 


On 3/18/2023 at 3:16 AM, StDragon said:

You had better THANK Nvidia for still making GPUs!!!

Nvidia isn't making GPUs out of the goodness of their heart. Gaming GPU demand is consistent and a major revenue stream for Nvidia, and it will remain so for the foreseeable future. Other clients come and go; gaming demand has always been there for Nvidia. That's also why Nvidia doesn't really care about keeping gamers happy: they know they can always count on people wanting to game.

The 4000-series pricing is the hangover from ETH mining, same as the 2000 series. Withholding products to keep prices up is neither a good nor a sustainable business model.

We can hope for the 5000 series to have sensible prices. Getting 3080-level performance at a higher price two years later is not sensible pricing.


On 3/19/2023 at 8:07 AM, Monkey Dust said:

How much do you want ray tracing? There are few games that properly take advantage of it right now.

 

Unless you're into a game that uses it well, like Cyberpunk, I'd definitely go team red this gen. And I say that as someone currently running a 3080 Ti, my third Nvidia card in a row, so it's not like I'm a rabid AMD fanboy. Just give the reference cooler a wide berth.

You're right about ray tracing - to me it's a useless gimmick. But NV still has the upper hand in recording/streaming, AFAIK? That's the big one for me.

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


Not worth it; better to get a 4070 Ti. And stop the trash talk that this GPU can't run 4K... there are plenty of trashy videos about how bad the 4070 Ti is, and of course it can't run native 4K with all settings maxed, including RT. In reality this card is great: I have one and have tested it myself at 4K, and it runs games very well with max settings and DLSS. I have no problem running even Hogwarts Legacy at 4K Ultra with DLSS and frame generation, hitting ~70-120 FPS depending on the area.

People need to understand that this GPU is not really meant for 4K; if you want to use 4K, you must use DLSS and frame generation where the game supports them, otherwise you get very bad FPS at 4K in most new games. Another reason people get bad FPS could be that they haven't enabled Resizable BAR in the BIOS. Intel Arc is a good example of how useless a GPU can be without Resizable BAR enabled; maybe the 4xxx series is similar.

At 4K you can even use Performance DLSS, which still renders at 1080p, and the final result is very sharp, with no big difference compared to native 4K; on top of that, frame generation doubles your FPS. It looks like brute-force rendering is slowly counting its last days, and Nvidia is a good example that with AI you can achieve the same result with far fewer resources: you don't need 20 GB of VRAM when you can render the game at 1080p and upscale to 4K without real quality loss, thanks to AI.
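For scale, a bit of arithmetic behind that last point (my numbers, not from the post): DLSS Performance mode at a 4K output renders internally at 1080p, which is exactly a quarter of the output pixels:

```python
# Pixel-count arithmetic for DLSS Performance mode at a 4K output:
# the GPU renders internally at 1080p, then the upscaler produces 2160p.
output_4k = 3840 * 2160       # pixels presented on screen
internal_1080p = 1920 * 1080  # pixels actually rendered
ratio = internal_1080p / output_4k
print(f"Rendered pixels: {ratio:.0%} of the 4K output")  # Rendered pixels: 25% of the 4K output
```

So the shader workload per frame is roughly a quarter of native 4K, which is where the large FPS headroom comes from (the upscaling pass itself has some cost on top).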


On 3/19/2023 at 8:31 PM, XNOR said:

Gaming cards are still a major profit center - literally billions, roughly the same as data center profits. No shareholder is going to be pleased by a company throwing that away, especially as the AI money could be gone in a few years too: not in the direct sense, but ASICs might take over the market.

 

AI means Nvidia is lucking out again, but that doesn't mean they can just neglect a huge and stable market. We've all seen how quickly things can turn around when you're into the hip thing of the moment.

This breakdown is true, but you have to factor in that the datacenter revenue is actually limited by supply. So let's say, for example, Nvidia stopped making all consumer gaming GPUs; potentially (not saying how likely, but go with me here) the entire revenue could be made up by the increase in datacenter GPUs.

 

It could even be more profitable for Nvidia to do that, since they could stop making something like 10 source dies and only make 3: big, medium, small. Then pump those out at datacenter product-tier pricing.

 

As for factors why they wouldn't do this: brand protection. Exiting the consumer market would be a massive risk, almost guaranteed to cause business regression in the long run.


4 minutes ago, leadeater said:

This breakdown is true, but you have to factor in that the datacenter revenue is actually limited by supply. So let's say, for example, Nvidia stopped making all consumer gaming GPUs; potentially (not saying how likely, but go with me here) the entire revenue could be made up by the increase in datacenter GPUs.

It could even be more profitable for Nvidia to do that, since they could stop making something like 10 source dies and only make 3: big, medium, small. Then pump those out at datacenter product-tier pricing.

As for factors why they wouldn't do this: brand protection. Exiting the consumer market would be a massive risk, almost guaranteed to cause business regression in the long run.

This is generally true, but right now fabs are not saturated with orders. Apple, AMD, and Nvidia have been lowering their fab allocations. TSMC probably has the capacity to fulfill any spontaneous wishes.


18 minutes ago, HenrySalayne said:

This is generally true, but right now fabs are not saturated with orders. Apple, AMD, and Nvidia have been lowering their fab allocations. TSMC probably has the capacity to fulfill any spontaneous wishes.

The problem is that increasing production of something isn't a spontaneous thing for anyone involved. A lot of the production Nvidia has cut is wafers used for laptop chips and other middle-to-low-end desktop GPUs, but realistically mostly laptops.

 

Increasing production of something like an H100 or A100 is much more challenging compared to their other products; note there are other GPUs, like the A30, that use the big die plus HBM. These require an interposer and HBM, neither of which is in massive supply. Lower-tier products that are not HBM-based mostly use the largest-density, highest-capacity DRAM modules and have to pass ECC validation testing, so suitable DRAM supply may not be there either.

 

And then there is the simplest factor of all: market price control. Nvidia wouldn't want to significantly increase H100/A100 production and supply, diluting their value, in case the market situation changes. It's in a way ideal to be in a situation where every product you make sells out, but not to the point that it causes supply-chain or customer problems. That way, if the market situation does change, you're pretty well insulated from it, other than needing to respond with product changes.

 

The above point is the basic business lesson learned (and re-learned) during the first mining boom, and it's why the response was so drastically different the second time around.


A partial sidebar, but related: I've always wondered about this - the sheer number of graphics card SKUs has always seemed to me not only excessive but borderline preposterous. I know more choice is generally better, but is there really that much demand from consumers for THAT much choice?

 

I'd think you could literally get away with Low, Medium, and High performance tiers, though I recognize that might be a little simplistic. But to have essentially five basic tiers (xx50-xx90) and then start adding Tis and Supers on top almost feels like it's meant to confuse. It has to be more expensive for them to maintain, too, no?

 

Or is it less about real/perceived market demand and more about utilizing different-quality chips? I know imperfect chips can become different CPU and GPU models. Are some of the different products actually just a direct result of that, rather than of actively seeking extra SKUs? A chicken-or-egg thing, I guess?


18 minutes ago, Holmes108 said:

Or is it less about real/perceived market demand and more about utilizing different-quality chips? I know imperfect chips can become different CPU and GPU models.

Without information about product yields, I'd say this is quite a large factor. Supply constraints play into it as well. Say 1 GB DRAM modules are in low supply due to the transition to 2 GB mass production: a double-VRAM variant could be released. Or the volume of sales is lower than expected on a larger die that has already been manufactured and paired with GDDRX rather than GDDR: create a new lower-tier product based on it and microcode-limit the usable SMs, etc.

 

You can see examples of this by looking through the GeForce Wikipedia pages at all the different SKUs and release dates, along with which die and memory they use. Take a look at the RTX 3060/RTX 3060 Ti as a starting point.


19 minutes ago, Holmes108 said:

Or is it less about real/perceived market demand and more about utilizing different-quality chips? I know imperfect chips can become different CPU and GPU models. Are some of the different products actually just a direct result of that, rather than of actively seeking extra SKUs? A chicken-or-egg thing, I guess?

There might be an even better example of that going on with AMD's chiplet CPUs (excluding APUs). Essentially, AMD makes only one CPU chiplet per generation. To start with, you could say there are two CPU offerings based on one or two of these chiplets: the 7700X with one, the 7950X with two.

They have some duff cores, so let's make core reduced versions: 7900X, 7600X

Some don't clock so well? Add: 7900, 7700, 7600

Let's stick some more cache on too! Add: 7950X3D, 7900X3D, 7800X3D.

 

From manufacturing one base die, there are 10 products covering 6 to 16 cores. For software that scales well, that's about a 2.5x to 3x peak execution difference from the lowest to the highest parts, depending on how turbo works. For more complex workloads the scaling could well be different.
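A quick sketch of that spread (the model-to-core mapping is from the list above; treating peak throughput as linear in core count is my simplification, ignoring clocks and cache):

```python
# One Zen 4 chiplet die, ten retail SKUs: core counts per the post above.
skus = {
    "7600": 6, "7600X": 6,
    "7700": 8, "7700X": 8, "7800X3D": 8,
    "7900": 12, "7900X": 12, "7900X3D": 12,
    "7950X": 16, "7950X3D": 16,
}
spread = max(skus.values()) / min(skus.values())
print(f"{len(skus)} SKUs from one die; {spread:.2f}x core-count spread")
# 10 SKUs from one die; 2.67x core-count spread
```

The 2.67x core-count spread lands inside the "2.5x to 3x" range once turbo-clock differences are folded in.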

 

 

Back to Nvidia: for the current consumer generation, they make five dies:

AD102 - 4090

AD103 - 4080, 4090 Laptop

AD104 - 4070 Ti, 4080 Laptop, rumoured 4070

AD106 - 4070 Laptop, future lower tier desktop parts?

AD107 - 4050 Laptop, 4060 Laptop, future lower tier desktop parts?

 

The peak execution difference between the 4050 Laptop part and the 4090 desktop is about 9x, without multiple core dies to help here. So more silicon options are used to best fit each point in the product stack.

 

Since we don't know the eventual models that will be offered in the 40 series, we can look at the 30 series for some indication of how many products there may be. By my count, the desktop 30-series stack had about 12 versions, counting only models with unique core and/or memory configurations. That was during the mining era, so I think they were more aggressively reusing any die to make a product, which is why we more often saw partially defective bigger dies cut down into lower offerings than we otherwise might.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Honestly, just wait for what AMD releases; they often give better deals, especially if you don't need ray tracing.


4 minutes ago, Arokhantos said:

Honestly, just wait for what AMD releases; they often give better deals, especially if you don't need ray tracing.

I mean, it's not like RDNA 3 is even bad at ray tracing; it's about as good as Ampere.


4 minutes ago, starsmine said:

I mean, it's not like RDNA 3 is even bad at ray tracing; it's about as good as Ampere.

Drivers also keep getting better. It's a bit of a rollercoaster, but mostly avoidable by going Linux or not updating for a while; the only downside is the lack of feedback AMD gets this way, unless they do take note of users staying on older drivers.


6 hours ago, leadeater said:

The problem is that increasing production of something isn't a spontaneous thing for anyone involved. A lot of the production Nvidia has cut is wafers used for laptop chips and other middle-to-low-end desktop GPUs, but realistically mostly laptops.

 

Increasing production of something like an H100 or A100 is much more challenging compared to their other products; note there are other GPUs, like the A30, that use the big die plus HBM. These require an interposer and HBM, neither of which is in massive supply. Lower-tier products that are not HBM-based mostly use the largest-density, highest-capacity DRAM modules and have to pass ECC validation testing, so suitable DRAM supply may not be there either.

 

And then there is the simplest factor of all: market price control. Nvidia wouldn't want to significantly increase H100/A100 production and supply, diluting their value, in case the market situation changes. It's in a way ideal to be in a situation where every product you make sells out, but not to the point that it causes supply-chain or customer problems. That way, if the market situation does change, you're pretty well insulated from it, other than needing to respond with product changes.

 

The above point is the basic business lesson learned (and re-learned) during the first mining boom, and it's why the response was so drastically different the second time around.

I agree that other components like memory could be a limiting factor. However, demand is currently lower than expected. IMHO the current situation makes it quite easy to squeeze in additional batches.

Manufacturers like TSMC generally want long-term commitments from their customers, but they will still gladly sell free capacity if they have no other customers. I could see Nvidia temporarily increasing its output of data-centre cards if the market has changed considerably and doing so would not endanger its sales in the coming years.


On 3/24/2023 at 2:08 PM, leadeater said:

This breakdown is true, but you have to factor in that the datacenter revenue is actually limited by supply. So let's say, for example, Nvidia stopped making all consumer gaming GPUs; potentially (not saying how likely, but go with me here) the entire revenue could be made up by the increase in datacenter GPUs.

 

It could even be more profitable for Nvidia to do that, since they could stop making something like 10 source dies and only make 3: big, medium, small. Then pump those out at datacenter product-tier pricing.

 

As for factors why they wouldn't do this: brand protection. Exiting the consumer market would be a massive risk, almost guaranteed to cause business regression in the long run.

Not just brand protection, but the most important factor of all (at least according to them): shareholders. They don't like ditching known profit for uncertain ventures, even if it pays off initially.


$750 is steep for a 70-class card; it will be interesting to see it go up against the 30-series lineup. What a mess this generation has been.

 

The problem for us in the UK/EU is that we get a premium charge on top of that, making the situation worse.

 

Let's hope the launch of this card lowers 30-series prices by some margin so they become more affordable. Over here in the UK, 30-series cards are way overpriced, even second-hand.

 

  • RTX 3090 is going for around £1,500 / $1,830 refurbished and £750 / $920 pre-owned (hard to find brand-new units these days; the majority of stock has gone)
  • RTX 3080 Ti is going for over £1,000 / $1,220 and £730 / $890 pre-owned
  • RTX 3080 is going for £720 / $880 and £500 / $610 pre-owned
  • RTX 3070 is going for £550 / $670 and £360 / $440 pre-owned

All prices include VAT (tax).
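Taking those GBP figures at face value, the pre-owned discount relative to the refurbished/new price works out roughly as follows (my arithmetic; prices from the list above):

```python
# UK 30-series street prices quoted above: (refurb_or_new_gbp, pre_owned_gbp).
prices = {
    "RTX 3090":    (1500, 750),
    "RTX 3080 Ti": (1000, 730),
    "RTX 3080":    (720, 500),
    "RTX 3070":    (550, 360),
}
for card, (new, used) in prices.items():
    saving = 1 - used / new  # fraction saved by buying pre-owned
    print(f"{card}: pre-owned is {saving:.0%} cheaper")
```

So the pre-owned discount ranges from roughly a quarter to half off, which is why the "second-hand is risky" trade-off below is even a question.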

 

When you're paying that much for a card, second-hand is pretty risky, IMO. I just hope the market gets better.

 

The unfortunate thing is that Nvidia is most likely going to hold off until 30-series stock has pretty much gone before releasing the lower-tier 40-series cards to market.


On 3/20/2023 at 10:50 AM, Mark Kaine said:

You're right about ray tracing - to me it's a useless gimmick. But NV still has the upper hand in recording/streaming, AFAIK? That's the big one for me.

There are a lot of RTX mods out there. PrBoom (modded classic Doom) and the original Quake get ray tracing too. It's not a useless gimmick when it brings new life to old games.

