
NVIDIA GeForce RTX 4090 Benchmark Leak Shows A 65% Gain Over RTX 3090 Ti (Updated)

If you have a 3000 series card, the 4090 is really the only card worth getting. I just hope I can grab the model I want at MSRP 😕


7 hours ago, Noble3212 said:

To be completely honest, I'm not buying a 4090. I've heard that during testing the power connector started to melt. Plus it's more expensive than AMD's response, so it looks like team red might win this one.

No, no, and more no. AMD's upcoming, slower GPU has a 450 W TDP just like Nvidia's, and the same connector. Try again?

CPU: i9-13900KS | Motherboard: ASUS Z790 HERO | Graphics: ASUS TUF 4090 OC | RAM: G.Skill 7600 DDR5 | Screen: ASUS 48" OLED 138 Hz


9 minutes ago, Shzzit said:

No, no, and more no. AMD's upcoming, slower GPU has a 450 W TDP just like Nvidia's, and the same connector. Try again?

Neither has been reviewed, and neither is faster than the other 🙃


4 hours ago, leadeater said:

Neither has been reviewed, and neither is faster than the other 🙃

I disagree. Nvidia's GPUs are faster*.

 

Spoiler

* to hit the market

 


1 hour ago, HenrySalayne said:

I disagree. Nvidia's GPUs are faster*.

 

  Reveal hidden contents

* to hit the market

 

Ah yes, but AMD will be faster to run out of stock.


1 hour ago, HenrySalayne said:

I disagree. Nvidia's GPUs are faster*.

 

  Reveal hidden contents

* to hit the market

 

hahaha 8)



On 10/3/2022 at 3:45 AM, leadeater said:

Stock, if you believed the rumors of the RTX 4090 being 2x-4x faster than last gen then you needed to lay off breathing in the LN2 🙃 lol

Come on leadeater, you are better than this.

Nobody has said that the 4090 will be 4 times as fast as the 3090-whatever in all benchmarks.

The x4 claim is specifically about ray tracing performance, and I would not be surprised if that is more or less true.

The x2 rasterization performance is probably true as well, but only in certain scenarios. Which scenarios remains to be seen. It might be in most scenarios. It might only be in 4K or even 8K gaming. 

 

 

15 hours ago, Noble3212 said:

To be completely honest, I'm not buying a 4090. I've heard that during testing the power connector started to melt. Plus it's more expensive than AMD's response, so it looks like team red might win this one.

1) Source on the power connector melting, please.

2) We don't even know what AMD's competitor looks like so I think it's a bit unwise to say that AMD has won already, when we don't actually know the performance of this card, and neither the performance nor price of AMD's card. 

 

 

  

10 hours ago, HumdrumPenguin said:

If you have a 3000 series card, the 4090 is really the only card worth getting. I just hope I can grab the model I want at MSRP 😕

I'd argue that if you have a 3000 series card then you shouldn't be buying something new at all. People need to stop being brainwashed by marketing and influencers into thinking "buying things" is a hobby.

Just keep using what you already have. I promise you that spending $1000+ on a new graphics card to go from 160 FPS to 200 FPS won't make the game you play more enjoyable. You get a small dose of dopamine when you buy the thing, and then it's gone within a week when whichever influencer you follow starts hyping up the next gadget.


10 minutes ago, LAwLz said:

Come on leadeater, you are better than this.

Nobody has said that the 4090 will be 4 times as fast as the 3090-whatever in all benchmarks.

The x4 claim is specifically about ray tracing performance, and I would not be surprised if that is more or less true.

The x2 rasterization performance is probably true as well, but only in certain scenarios. Which scenarios remains to be seen. It might be in most scenarios. It might only be in 4K or even 8K gaming. 

lol nope, all of these rumors are exactly as bunk as the statement I made here.

 

One outlier does not make it a real thing, so 4x will not be true, 2x will not be true, none of it will be true.

 

If you can show it doing that in 10 games, i.e. consistently, then you get to make such a claim; otherwise I'll lambast it, and you can whine all you want, it changes nothing.

 

This sucking up of rumors and over-hyping everything to x1000 simply needs to stop; it's neither productive nor realistic for anyone. You are better than this, and you should apply the exact same objectivity or you are doing yourself a disservice, just some advice.

 

Anything more than 80% gen-on-gen gains in the span of 2 years necessitates a stop-and-think mentality; it's almost certainly not true. At least this rumor is actually realistic, as rumors tend to get more accurate closer to product release.


14 minutes ago, leadeater said:

lol nope, all of these rumors are exactly as bunk as the statement I made here.

 

One outlier does not make it a real thing, so 4x will not be true, 2x will not be true, none of it will be true.

 

If you can show it doing that in 10 games, i.e. consistently, then you get to make such a claim; otherwise I'll lambast it, and you can whine all you want, it changes nothing.

 

This sucking up of rumors and over-hyping everything to x1000 simply needs to stop; it's neither productive nor realistic for anyone.

2x rasterization performance and 4x ray tracing performance are not rumors...

 

I will probably be able to get back to this topic a couple of days after reviews have dropped, with some 5-10 gaming benchmarks showing roughly 2x rasterization performance and roughly 4x ray tracing performance.

But you have to promise me not to backpedal or move the goalposts, okay? No "okay, you showed me benchmarks, but I don't think they should count because..." or that kind of response. I would be very disappointed if I went through the effort to prove you wrong only for you to not admit to being wrong and then ignore evidence that contradicts some predefined conclusion.

 

 

The 4090 will be compared against the 3090 Ti.

Please note that DLSS 3 might be used for the 4090, while the 3090 Ti might have to use DLSS 2 since DLSS 3 is not supported on it.

I don't think this is "cheating" since it is the same situation as newer CPUs supporting newer instructions such as AVX2. You don't kneecap the newer stuff when making comparisons, you compare them both when they are at their best. Does that seem logical and fair to you?

 

 

 

14 minutes ago, leadeater said:

Anything more than 80% gen-on-gen gains in the span of 2 years necessitates a stop-and-think mentality; it's almost certainly not true. At least this rumor is actually realistic, as rumors tend to get more accurate closer to product release.

So you think 80% gain is realistic, but 100% gain is completely unrealistic? Remember, 2x is a 100% increase in performance. 4x is a 300% increase in performance.
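Since the multiplier-vs-percentage arithmetic keeps coming up in this thread, here's a tiny Python sketch making the conversion explicit (the helper name is mine, purely illustrative):

```python
def pct_increase(multiplier: float) -> float:
    """Convert an "Nx faster" multiplier into a percentage increase."""
    return (multiplier - 1) * 100

print(round(pct_increase(2.0)))  # 2x   -> 100% increase
print(round(pct_increase(4.0)))  # 4x   -> 300% increase
print(round(pct_increase(1.8)))  # 1.8x -> 80% increase
```

So the gap between "80% is plausible" and "2x is absurd" is only 20 percentage points.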


41 minutes ago, LAwLz said:

2x rasterization performance and 4x ray tracing performance are not rumors...

 

Quote

Gaming performance in the latest titles rockets by up to 2X, and by harnessing DLSS 3 and new Ada innovations, developers can boost performance by up to 4X in fully ray-traced titles. In creative apps, GeForce RTX 40 Series graphics cards provide up to 2X the performance in 3D rendering, video export speed, and AI tools.

First, "gaming performance" != rasterization, so not 2x rasterization (this is likely true, but again, be wary; also "up to 2x" is not the same thing as saying a 2x improvement over last gen). This has only ever been a rumor claim, not a statement by Nvidia. As for the 4x, Nvidia does say that itself, true enough.

 

Unless you are isolating the specific test claim and rendering the same thing, you cannot blanket-claim a 4x performance increase like this, because it will not be true.

 

If you can get a collection of 10 games once the RTX 40 series releases with ONLY RTX ray tracing enabled and no DLSS, so native vs native, then I will agree there has been a 4x increase in RT performance.

 

(RT + DLSS 2 or 3) != RT

 

Again, outliers are not particularly relevant; while one game could achieve the claim Nvidia has actually made, that's not a supporting argument for saying the RTX 40 series has improved by that much compared to the RTX 30 series. If we are going to accept edge cases as objective truths, then AMD GPUs have been the performance flagships multiple times over, which isn't true at all.

 

"Up to 2x" and "2x faster" are not the same thing; for the latter I expect a reasonable average across a range of titles to achieve it, not a singular result. Wording matters.

 

41 minutes ago, LAwLz said:

So you think 80% gain is realistic, but 100% gain is completely unrealistic? Remember, 2x is a 100% increase in performance. 4x is a 300% increase in performance.

No, I think there has to be a line where you must automatically raise skepticism, and 80% feels about right for rumors.

 

Edit:

Added some clarification, if you REALLY want to bother coming back to this.


On 10/3/2022 at 1:32 AM, Brandi93 said:

If this is scaled throughout all the RTX 4xxx Series, and for example an RTX 4060 is 1.6x more powerful than an RTX 3060, it might mean that lower end cards will no longer be 1080p cards but 1440p and even 4K for Ti versions! 

Which is kind of a moot point if "lower end" cards still hit the market at $800 or more.

 

Also bear in mind that the 4080 is heavily scaled down from the 4090, so the scaling may not be quite as linear.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


11 minutes ago, leadeater said:

First, "gaming performance" != rasterization, so not 2x rasterization. This has only ever been a rumor claim, not a statement by Nvidia. As for the 4x, Nvidia does say that itself.

When people say rasterization vs ray tracing on this forum, they are talking about gaming performance.

Nvidia's claims are that the 4090 will get up to 2x gaming performance in titles that do not use ray tracing, compared to the 3090 Ti. That might be combined with the improvements brought by DLSS 3.

 

I feel like you are once again in a situation where you use different definitions of words compared to what others use, and it results in you talking past the person you are talking to.

 

 

15 minutes ago, leadeater said:

Unless you are isolating the specific test claim and rendering the same thing, you cannot blanket-claim a 4x performance increase like this, because it will not be true.

True, and that's why I isolated what I was going to look for.

It's very important to specify what you are talking about. Neither you nor these supposed rumors you are referring to are specifying what you are talking about. "We will get 2x-4x performance gains!" is most likely just as wrong as saying "we won't get 2x-4x performance".

The truth is probably somewhere in the middle.

 

 

18 minutes ago, leadeater said:

If you can get a collection of 10 games once the RTX 40 series releases with ONLY RTX ray tracing enabled and no DLSS, so native vs native, then I will agree there has been a 4x increase in RT performance.

I feel like you are already moving the goalposts.

I specifically said the claims would probably be true if we included DLSS, and your first response is to say "you are not allowed to use DLSS".

 

 

20 minutes ago, leadeater said:

Again, outliers are not particularly relevant; while one game could achieve the claim Nvidia has actually made, that's not a supporting argument for saying the RTX 40 series has improved by that much compared to the RTX 30 series. If we are going to accept edge cases as objective truths, then AMD GPUs have been the performance flagships multiple times over, which isn't true at all.

True, but we shouldn't pretend outliers don't exist either. They are worth mentioning if they are relevant, but they shouldn't be used as the basis of generalizations. That goes both ways.

 

 

21 minutes ago, leadeater said:

No, I think there has to be a line where you must automatically raise skepticism, and 80% feels about right for rumors.

You do you.

I personally don't feel like "80% is perfectly reasonable, but if you expect a 100% increase then you are totally unreasonable and should be ridiculed" is a sensible position.

To me, they are close enough that they don't warrant such a dramatic change in how they get treated. If it is reasonable to expect an 80% increase in performance, then I think expecting a 100% increase in performance is fairly reasonable too.

 

Personally, I think the overall performance gain will probably land somewhere around 60-70% for non-ray-traced gaming performance. For ray-traced games it will probably vary a lot more, but the ray tracing performance itself (not the FPS) will probably more than double. Maybe even 3x. That won't translate to 3x FPS though.

Link to comment
Share on other sites

Link to post
Share on other sites

21 minutes ago, LAwLz said:

I feel like you are already moving the goalposts.

No, I'm explicitly defining exactly what I meant in my original reply that you responded to.

 

"Up to 2x/4x" is not a 2x/4x performance increase; as I said, outliers do not make a case, hence Nvidia's actual wording vs the rumors and what people think and talk about.

 

Also, damn, take a joke as it is. These wild performance claims were thrown around in past tech news rumor topics, and Nvidia is specifically not actually claiming them. Anyone who took those rumors to heart really did get sucked in a bit.

 

21 minutes ago, LAwLz said:

When people say rasterization vs ray tracing on this forum, they are talking about gaming performance.

I quoted Nvidia's RTX 40 announcement; those are Nvidia's words, not those of anyone on this forum or any other regular person. Since it's Nvidia saying "gaming performance", and it's marketing, I shall always apply no more than what is said. They didn't specifically say rasterization, so I'm not going to automatically assume it is, especially given this is Nvidia.

 

21 minutes ago, LAwLz said:

True, but we shouldn't pretend outliers don't exist either. They are worth mentioning if they are relevant, but they shouldn't be used as the basis of generalizations. That goes both ways.

That's the point you have been arguing against just now... over an obvious joke...

 

21 minutes ago, LAwLz said:

You do you.

I personally don't feel like 80% is perfectly reasonable, but if you expect a 100% increase then you are totally unreasonable and should be ridiculed.

To me, they are close enough that they don't warrant such dramatic change in how they get treated. If it is reasonable to expect a 80% increase in performance then I think expecting a 100% increase in performance is fairly reasonable too.

You literally do not get the point at all. This isn't what I think is a reasonable expectation of performance; this is what I think is a reasonable response to rumors and claims of performance increases. We already know rumors are not accurate, and you'll argue whatever line is drawn, so this doesn't even matter. Simply put, rumor claims of over an 80% performance increase should have automatic skepticism applied. Being skeptical isn't saying something is false; it's saying it's unlikely to be true. Be wary, be objective, analyze.

 

P.S. And no, if we include XYZ then the statement has been changed and the goalposts shifted, and I'll not accept that any more than you would. It either is or is not, not "is, but only if".

 

I feel like you are just arguing because you like arguing and have no actual purpose behind it, nor are you being constructive either, I might add.


4 minutes ago, leadeater said:

No I'm explicitly defining exactly what I meant in my original reply that you replied to.

No you didn't.

You just said it won't be 2x-4x faster.

It most certainly will. What we don't know is in which scenarios, and how relevant those are. In your mind you might have been specific because you knew you were talking about overall performance in terms of "raw horsepower", but since you didn't actually specify that in your post, it is hard for outsiders reading your posts to tell.

Again, if we don't use the same definitions of words then we just end up talking past each other. I would like for that to change because it is very annoying.

 

 

9 minutes ago, leadeater said:

"Up to 2x/4x" is not a 2x/4x performance increase; as I said, outliers do not make a case, hence Nvidia's actual wording vs the rumors and what people think and talk about.

You are not exactly helping the situation of people misunderstanding what was being said by making sweeping generalizations in the opposite direction.

 

 

Nvidia - It will happen in some situations.

Supposed rumors you have read - It will happen!

You - It won't happen!

Me - It will probably happen in some situations.

 

 

11 minutes ago, leadeater said:

Also, damn, take a joke as it is. These wild performance claims were thrown around in past tech news rumor topics, and Nvidia is specifically not actually claiming them. Anyone who took those rumors to heart really did get sucked in a bit.

I haven't seen those rumors, but I don't read many rumors to begin with. Anyway, if there are rumors of that sort, then isn't it a good idea to explain the situation so that people hopefully get the correct impression, rather than add fuel to the fire by making other, probably equally incorrect statements?

Yes, "it was a joke", but the first part of your post, which is the part I take issue with, does not read like a joke, and I believe it will give an impression just as false as the rumors do. Although expectations being exceeded is probably not as bad as expectations not being met.

 

 

 

22 minutes ago, leadeater said:

Simply put, rumor claims of over an 80% performance increase should have automatic skepticism applied. Being skeptical isn't saying something is false; it's saying it's unlikely to be true. Be wary, be objective, analyze.

Completely agree that people should be skeptical. But in this case you weren't skeptical. You said it was false, which is what I objected to.

In before "I didn't say it was false, I just said people who think it is true are high as a joke". That is the same as saying it is false. That is how other people interpret your post. You might not think they do because your intentions were to say something else, but that is what your post says.

 

Person 1: "The temperature of Ryzen 7000 chips isn't really an issue".

Person 2: "Ryzen 7000 CPUs run at close to 100 degrees. If you have one then I hope you've got a fire extinguisher in your room lol".

I can claim that this is a joke as well and that they didn't really say the temperature is an issue, but that is how most people reading the post will interpret it. Same with your post, where you jokingly say it is stupid to believe the 4090 will get 2x-4x performance gains over the 3090 Ti.


Another 60% on top of the already stupidly fast 3090 Ti? Maybe we'll finally be able to play Cyberpunk at 4K + ray tracing at Ultra/Psycho. Maybe even with DLSS on quality? Performance looks impressive.

 

Power consumption is another thing though. I'm not gonna pretend that customers of this kind of product even care about the electricity bill, but the PC still has to be able to handle the power draw and heat. For most people this thing will not be a drop-in replacement. You'll likely need a new PSU as well, in my case even a water block. So realistically the upgrade cost will be in the ballpark of $2000 USD for most people, provided the MSRP actually means something this generation.

 

I'm not gonna buy one, but I still find the performance leaps interesting and will keep an eye out for 3rd party benchmark results.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


So what's the power consumption per unit of performance on these two?

 

What's the cost vs. performance? I mean, my rough math says that the 3090 Ti is 68% the cost of the 4090.

 

Seeing performance gains in a vacuum is all well and good, but where does the rubber meet the road? A 45% cost increase (or more) for at best a 65% gain; that's getting borderline marginal.

 

At a $1300 price point, sure. But at a $1600-1800 price point... this isn't really groundbreaking.
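The rough math above can be sketched out; the prices below are the poster's approximate figures and the leaked performance number, not confirmed data:

```python
# Assumed prices: ~$1,100 street price for a 3090 Ti, $1,599 MSRP for the 4090.
price_3090ti = 1100
price_4090 = 1599
perf_gain = 1.65  # the leaked "65% faster" figure, unverified

cost_ratio = price_4090 / price_3090ti   # ~1.45, i.e. roughly a 45% cost increase
cost_increase = cost_ratio - 1
value_ratio = perf_gain / cost_ratio     # performance per dollar, 4090 vs 3090 Ti

print(f"cost increase: {cost_increase:.0%}")
print(f"perf per dollar vs 3090 Ti: {value_ratio:.2f}x")
```

On these assumed numbers the 4090 comes out roughly 14% ahead on performance per dollar; whether that counts as "marginal" is exactly the debate here.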


12 minutes ago, IPD said:

So what's the power consumption per unit of performance on these two?

 

What's the cost vs. performance? I mean, my rough math says that the 3090 Ti is 68% the cost of the 4090.

 

Seeing performance gains in a vacuum is all well and good, but where does the rubber meet the road? A 45% cost increase (or more) for at best a 65% gain; that's getting borderline marginal.

 

At a $1300 price point, sure. But at a $1600-1800 price point... this isn't really groundbreaking.

The super top-of-the-line cards have never been about efficiency or price to performance.

If you care about price to performance, get a lower end card. That advice has been true ever since I got into computer hardware.

 

We don't have any third party numbers so take this with a massive shovel of salt, but Nvidia claims the same power consumption for this card as the 3090 Ti, which means efficiency is also up 65% or whatever it was.

 

 

My guess, and hope, is that the prices will drop quite quickly. AMD and Intel will hopefully push prices down when their cards get released, and I think the market as a whole won't really buy these cards at these prices. Especially not since we are heading towards a recession.


15 minutes ago, LAwLz said:

The super top-of-the-line cards have never been about efficiency or price to performance.

If you care about price to performance, get a lower end card. That advice has been true ever since I got into computer hardware.

 

We don't have any third party numbers so take this with a massive shovel of salt, but Nvidia claims the same power consumption for this card as the 3090 Ti, which means efficiency is also up 65% or whatever it was.

If the 4090 managed to get 60% more performance while staying in the same power envelope as the 3090 Ti, that's a mighty impressive performance-per-watt boost.
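With both cards specced at the same board power, the perf-per-watt arithmetic is direct; a quick sketch with hypothetical frame rates (the 450 W figure is from spec sheets, the uplift is the claimed number, not a measurement):

```python
# Nvidia lists the same 450 W board power for both cards.
power_w = 450
fps_3090ti = 100          # hypothetical baseline frame rate
fps_4090 = 160            # the claimed ~60% uplift at the same power

perf_per_watt_old = fps_3090ti / power_w
perf_per_watt_new = fps_4090 / power_w

# With power held constant, the perf/watt gain equals the raw perf gain.
gain = perf_per_watt_new / perf_per_watt_old
print(f"perf/watt improvement: {gain:.2f}x")
```

In other words, at equal power the efficiency claim stands or falls entirely on the performance claim.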

 

15 minutes ago, LAwLz said:

My guess, and hope, is that the prices will drop quite quickly. AMD and Intel will hopefully push prices down when their cards get released, and I think the market as a whole won't really buy these cards at these prices. Especially not since we are heading towards a recession.

AMD maybe, but Intel? I don't think so. If their current top model sits around the same performance as the 3060, they won't really impact the high-end market. And with AMD, it depends on whether they're capable of a similar offering. As long as Nvidia has the best performing card, even if it's only by 5%, they're "entitled" to price it however they want. There are more than enough customers that just want the best, no matter the cost.



33 minutes ago, Stahlmann said:

AMD maybe, but Intel? I don't think so. If their current top model sits around the same performance as the 3060, they won't really impact the high-end market. And with AMD, it depends on whether they're capable of a similar offering. As long as Nvidia has the best performing card, even if it's only by 5%, they're "entitled" to price it however they want. There are more than enough customers that just want the best, no matter the cost.

If we are talking about the highest end products then yeah, AMD is our only hope. That and pricing just falling naturally as the product gets older.

But if we are talking about the mid to low end, then I think Intel has a fighting chance to bring prices down. Hopefully Nvidia won't price the 4060 at like 500 dollars.


6 hours ago, LAwLz said:

The x2 rasterization performance is probably true as well, but only in certain scenarios. Which scenarios remains to be seen. It might be in most scenarios. It might only be in 4K or even 8K gaming.

All the speed improvement does is move the bottleneck. You may still hit a CPU limit at, say, 20% higher FPS; in that instance it isn't the power of the GPU holding you back.
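That behavior can be modeled as the frame rate being capped by the slower of the two stages; the numbers here are made up for illustration:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The displayed frame rate is capped by whichever stage is slower."""
    return min(cpu_fps, gpu_fps)

cpu_limit = 120               # hypothetical CPU-bound frame rate
old_gpu, new_gpu = 100, 165   # a 65% faster GPU, per the leak

print(effective_fps(cpu_limit, old_gpu))  # 100 -> GPU-bound
print(effective_fps(cpu_limit, new_gpu))  # 120 -> CPU-bound: only a 20% real gain
```

So a 65% GPU uplift only delivers 65% more frames while the GPU is still the slower stage.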

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


6 hours ago, LAwLz said:

I'd argue that if you have a 3000 series card then you shouldn't be buying something new at all. People need to stop being brainwashed by marketing and influencers into thinking "buying things" is a hobby.

Just keep using what you already have. I promise you that spending $1000+ on a new graphics card to go from 160 FPS to 200 FPS won't make the game you play more enjoyable. You get a small dose of dopamine when you buy the thing, and then it's gone within a week when whichever influencer you follow starts hyping up the next gadget.

It really depends on the game, resolution, and the fluff turned on in your settings. But if you don't see any difference in the game(s) you play, then you're right. Although being an enthusiast kinda means you seek the best thing you can get and make it look nice, I guess, without thinking about what you actually "need".


11 minutes ago, ewitte said:

All the speed improvement does is move the bottleneck. You may still hit a CPU limit at, say, 20% higher FPS; in that instance it isn't the power of the GPU holding you back.

Basically, in all the non-shooter 7000fps+ scenarios the CPU bottleneck is way, way ahead of any GPU limit. I'm looking forward to finally reaching a stable 60+ FPS at 4K with eye candy, RT, and hopefully no or little DLSS. CPU is a 6700K.


14 minutes ago, ewitte said:

All the speed improvement does is move the bottleneck. You may still hit a CPU limit at, say, 20% higher FPS; in that instance it isn't the power of the GPU holding you back.

These GPUs we're talking about have reached such ridiculous performance numbers that they basically only make sense at 4K and up. And as soon as you switch to 4K, you're gonna be GPU-bound 99% of the time. But if this performance trend continues, we might be looking at such high framerates at 4K in a few years that CPU bottlenecks will start to pop up again.



57 minutes ago, Stahlmann said:

These GPUs we're talking about have reached such ridiculous performance numbers that they basically only make sense at 4K and up. And as soon as you switch to 4K, you're gonna be GPU-bound 99% of the time. But if this performance trend continues, we might be looking at such high framerates at 4K in a few years that CPU bottlenecks will start to pop up again.

The 4090 already gets 4K FPS over my max refresh rate quite often, based off the numbers that have been shown. I have my cap set to 118.



2 hours ago, LAwLz said:

The super top-of-the-line cards have never been about efficiency or price to performance.

If you care about price to performance, get a lower end card. That advice has been true ever since I got into computer hardware.

 

We don't have any third party numbers so take this with a massive shovel of salt, but Nvidia claims the same power consumption for this card as the 3090 Ti, which means efficiency is also up 65% or whatever it was.

 

 

My guess, and hope, is that the prices will drop quite quickly. AMD and Intel will hopefully push prices down when their cards get released, and I think the market as a whole won't really buy these cards at these prices. Especially not since we are heading towards a recession.

Wasn't asking for a comparison to midrange. Comparing between tiers isn't apples to apples; comparing top-end vs. top-end is. That's why I asked.

 

Even if power consumption is the same though, you're asking for a 45% (or more) premium for maybe a 65% increase (at most) in performance.

