Tongue-tied Ti(e) - Nvidia Allegedly Orders Partners to Halt RTX 3090 Ti Production

Lightwreather
1 hour ago, Kisai said:

Also I don't know anyone silly enough to spend more on a monitor than their GPU or console. You buy a monitor or television and you keep it until it dies, which can be as long as 15 years.

You just said yourself why it can make sense to spend a lot of money on a good monitor: typically it stays in your setup for a lot longer than most PC components. But as with everything, there are enthusiasts who always want the latest and greatest, so their monitors last one or two years tops.

 

1 hour ago, Kisai said:

And yes, 100% of consoles are still doing 1080p unless you've somehow got the magic unobtainium wand and created a PS5 out of thin air. Yes, they can do 4Kp120 if you have a monitor or television that was released in the last two years with HDMI 2.1 on it. But guess what, all those existing 4K TVs and monitors? 4Kp60 if they support HDMI 2.0 only. 4Kp120 on 4:2:0 content, but good luck, as no films or TV shows are released like that.

PS4 Pro and Xbox One X can do 4K60, which is what most 4K TVs with an HDMI 2.0b port support.
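For what it's worth, the back-of-the-envelope bandwidth math supports both halves of that. A rough sketch (the 4400×2250 total-pixel timing is the standard CTA-861 figure for 4K; treat the overhead numbers as approximations):

```python
# Rough HDMI bandwidth check for the 4K formats discussed above.
# Assumes the standard CTA-861 4K timing (4400x2250 total pixels, incl.
# blanking). HDMI 2.0 is 18 Gbps raw; 8b/10b encoding leaves ~14.4 Gbps
# for actual video data.

HDMI20_DATA_GBPS = 18.0 * 8 / 10  # ~14.4 Gbps usable

def data_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video data rate in Gbps."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("4K60  RGB 8-bit",    60, 24),  # full chroma
    ("4K120 RGB 8-bit",   120, 24),  # full chroma
    ("4K120 4:2:0 8-bit", 120, 12),  # chroma subsampled: half the bits/pixel
]
for label, hz, bpp in modes:
    rate = data_rate_gbps(4400, 2250, hz, bpp)
    verdict = "fits HDMI 2.0" if rate <= HDMI20_DATA_GBPS else "needs HDMI 2.1"
    print(f"{label}: {rate:5.1f} Gbps -> {verdict}")
```

So 4K60 RGB just squeaks into HDMI 2.0, 4K120 RGB needs 2.1, and 4K120 only fits HDMI 2.0 with 4:2:0 subsampling, which matches the claims above.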

 

1 hour ago, Kisai said:

And the PS4 Pro? Most games are 1080p60, even fewer are HDR, and those that claim to be 4K are actually upsampled to 4K or only running at 4Kp30; they aren't 4K60.

https://www.eurogamer.net/articles/digitalfoundry-2016-4k-gaming-on-ps4-pro-tech-analysis

Even if it's upscaled 4K, it still outputs a 4K image. My point still stands that current (and even last-gen) consoles support 4K, which is the standard for TVs nowadays. Whether the devs decide to implement it at 30, 60, or 120 fps doesn't really matter to my point.

 

1 hour ago, Kisai said:

Those graphs still indicate a 3060 is 1080p60 performance, so I don't know what you're trying to prove; it doesn't disprove what I've said.

Those graphs literally show that the 3060 is capable of over 100 fps on average at 1080p. In fact, it's still averaging 80+ fps @ 1440p. So saying the current XX60-series is a 1080p60-tier card is just wrong.


3 hours ago, IWannaBeUniqueMom said:

What’s wrong with your spelling and grammar?

Jesus Christ.

Like, actually nothing...

 

I legitimately hope English is not your first language, because it is mine, and I can assure you that spelling- and grammar-wise my post is fine, other than "Buy the time", heh, but that hardly warrants a pointless post to point it out lol


2 hours ago, Kisai said:

Those graphs still indicate a 3060 is 1080p60 performance, so I don't know what you're trying to prove; it doesn't disprove what I've said.

How, when it's averaging double that at 1080p and almost 50% more at 1440p? With that kind of margin of error, you might as well say the 3060 is basically the same as a 3070.


19 minutes ago, Stahlmann said:

You just said yourself why it can make sense to spend a lot of money on a good monitor: typically it stays in your setup for a lot longer than most PC components. But as with everything, there are enthusiasts who always want the latest and greatest, so their monitors last one or two years tops.

There is a difference between upgrading from a 720p to a 1080p TV, because there is a noticeable increase in resolution, and logically the larger the monitor, the higher the resolution should be. But the average person just buys a TV and it sits parked there until it dies. With old CRTs it was not unheard of to still be using a 42" CRT from 1975 all the way into 2000, because it still worked. It's only in the last 25 years or so that connectors have kept changing on the back of the TV, and people get annoyed and replace the damn thing when the connectors don't match.

 

Personally, I don't even have a TV. I've been using a 1080p monitor as a TV when I need to use "a TV", because it lacks the smart TV trash.

 

 

19 minutes ago, Stahlmann said:

 

PS4 Pro and Xbox One X can do 4K60, which is what most 4K TVs with an HDMI 2.0b port support.

But that's not what the PS4 Pro can actually do.

 

 

19 minutes ago, Stahlmann said:

Even if it's upscaled 4K, it still outputs a 4K image. My point still stands that current (and even last-gen) consoles support 4K, which is the standard for TVs nowadays. Whether the devs decide to implement it at 30, 60, or 120 fps doesn't really matter to my point.

No no, PS4 games were not designed for 4K. After the fact, some of them had it enabled, but just look at all the "exceptions":

 

https://www.gamesradar.com/ps4-pro-confirmed-games-list/

Quote
  • The Last of Us Remastered - 2560×1440 resolution max, choice between 60fps at 1800p resolution or 30fps at 2160p, HDR support.
  • Monster Hunter World - 1800p, graphics or frame rate options (both 1080p), HDR
  • Nier: Automata - 1080p, 60fps, improved lighting, shadows, anti-aliasing, texture filtering and motion blur.
  • Shadow of the Tomb Raider - 4K Resolution mode, 4K at 30 FPS, and High Resolution, 1080p at 60 FPS. HDR, improved physically-based rendering, hardware tessellation, anisotropic filtering, additional dynamic foliage, and more.

You get the option of 4Kp30 or 1080p60 in most cases. Not 4Kp60, and not full 4K resolution in many of them.

 

 

 

19 minutes ago, Stahlmann said:

Those graphs literally show that the 3060 is capable of over 100 fps on average at 1080p. In fact, it's still averaging 80+ fps @ 1440p. So saying the current XX60-series is a 1080p60-tier card is just wrong.

That doesn't count unless you can buy a monitor that operates at that rate natively. Monitors only operate at multiples of 60 or 72.

2560x1440 is an awful resolution to play at, because it results in nothing being integer-scaled. So 1080p games and video look like Vaseline has been smeared on the screen. No thanks.

 

 

 


33 minutes ago, Kisai said:

There is a difference between upgrading from a 720p to a 1080p TV, because there is a noticeable increase in resolution, and logically the larger the monitor, the higher the resolution should be. But the average person just buys a TV and it sits parked there until it dies. With old CRTs it was not unheard of to still be using a 42" CRT from 1975 all the way into 2000, because it still worked. It's only in the last 25 years or so that connectors have kept changing on the back of the TV, and people get annoyed and replace the damn thing when the connectors don't match.

I wouldn't say I'd replace my TV because the connectors don't match. If you have a 4K 50/60Hz TV and it has HDMI 2.0, there is no reason to cry about it. But lately it's a trend that people complain about newer displays not having HDMI 2.1, even though the monitor or TV doesn't even need it in terms of bandwidth. I can't count the times I've seen people say "I want to buy X 1440p 144Hz monitor, but it doesn't have HDMI 2.1." It fucking doesn't matter. 2.0 is everything it needs.
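For what it's worth, here's a quick sanity check of that claim; the 2640×1481 total-pixel figure is an assumed reduced-blanking-style timing, so treat it as approximate:

```python
# Does 2560x1440 @ 144 Hz, 8-bit RGB fit within HDMI 2.0?
# 2640x1481 total pixels is an assumed CVT-reduced-blanking-style timing.

h_total, v_total, hz, bpp = 2640, 1481, 144, 24
rate_gbps = h_total * v_total * hz * bpp / 1e9   # ~13.5 Gbps of video data
hdmi20_gbps = 18.0 * 8 / 10                      # ~14.4 Gbps usable after 8b/10b

print(f"needs {rate_gbps:.1f} Gbps, HDMI 2.0 offers {hdmi20_gbps:.1f} Gbps ->",
      "fits" if rate_gbps <= hdmi20_gbps else "does not fit")
```

It's close to the ceiling, but 1440p144 at 8-bit does fit in HDMI 2.0.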

 

But all in all, it's basically been HDMI all the way since it was introduced. I don't see your problem with changing connectors.

 

33 minutes ago, Kisai said:

Personally, I don't even have a TV. I've been using a 1080p monitor as a TV when I need to use "a TV", because it lacks the smart TV trash.

I'm glad my C9 has a smart interface. That keeps me from having to connect a media PC, Shield or Fire TV stick. Otherwise I'd have to use a friggin' mouse and keyboard or even a second remote to use it.

 

33 minutes ago, Kisai said:

2560x1440 is an awful resolution to play at, because it results in nothing being integer-scaled. So 1080p games and video look like Vaseline has been smeared on the screen. No thanks.

Except 99% of all relevant games support this resolution natively. Plus, if you're really desperate, for older games 720p can be used for integer scaling.
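A small sketch of the scale-factor arithmetic behind this back-and-forth (2560×1440 panel, common content resolutions):

```python
# Which common content resolutions scale to a 2560x1440 panel by an integer
# factor? Integer factors allow sharp pixel duplication; fractional factors
# force blurry interpolation (the "Vaseline" effect described above).

from fractions import Fraction

panel_w, panel_h = 2560, 1440

for w, h in [(1920, 1080), (1280, 720), (2560, 1440)]:
    fx, fy = Fraction(panel_w, w), Fraction(panel_h, h)
    kind = "integer" if fx == fy and fx.denominator == 1 else "fractional"
    print(f"{w}x{h} -> {panel_w}x{panel_h}: x{float(fx):.3f} ({kind})")
```

1080p lands on an ugly 1.333x factor, while 720p is an exact 2x.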


3 hours ago, Kisai said:

2560x1440 is an awful resolution to play at, because it results in nothing being integer-scaled. So 1080p games and video look like Vaseline has been smeared on the screen. No thanks.

How old are the games you are running? Set the game resolution to 2560x1440 and done. Been running 2560x1600 for yearrrrrrs, this is not a problem.


I just wanna chime in and say that although 1080p is the most used resolution today, that doesn't mean it should be the most used resolution forever.

More gamers are probably buying 1440p displays today compared to yesteryear, so its market share should be growing.

People buy what they want. I'd say x60 cards are perfectly adequate for the majority of gamers out there, but some want more, some can afford less.


1 hour ago, Moonzy said:

I just wanna chime in and say that although 1080p is the most used resolution today, that doesn't mean it should be the most used resolution forever.

More gamers are probably buying 1440p displays today compared to yesteryear, so its market share should be growing.

People buy what they want. I'd say x60 cards are perfectly adequate for the majority of gamers out there, but some want more, some can afford less.

Agree and disagree.

 

I do think the average resolution will increase over time.

 

But I also think it's barely happening right now. Those who can get their hands on a good GPU that can actually push those resolutions at decent framerates, and without filing for bankruptcy, sure.

But otherwise, the vast majority are still probably locked to 1080p because oh lord, do I buy a new GPU or do I keep my kidneys?


18 minutes ago, Rauten said:

But I also think it's barely happening right now. Those who can get their hands on a good GPU that can actually push those resolutions at decent framerates, and without filing for bankruptcy, sure.

1440p monitor prices seem to be dropping quite fast, so I expect people to grab them more than before.

1 hour ago, Moonzy said:

More gamers are probably buying 1440p displays today compared to yesteryear.

 


7 minutes ago, Moonzy said:

1440p monitor prices seem to be dropping quite fast, so I expect people to grab them more than before.

Yes, but the problem isn't the price of the monitors, it's the price and availability of GPUs that can make use of those monitors.

Why buy a 1440p monitor if you're stuck with a 1060 or a 1050Ti?

🤷‍♂️


2 minutes ago, Rauten said:

Yes, but the problem isn't the price of the monitors, it's the price and availability of GPUs that can make use of those monitors.

Why buy a 1440p monitor if you're stuck with a 1060 or a 1050Ti?

🤷‍♂️

There's no rule saying you need a GPU capable of pushing 1440p before buying a 1440p monitor.

Some people buy their equipment piece by piece.


6 minutes ago, Rauten said:

Why buy a 1440p monitor if you're stuck with a 1060 or a 1050Ti?

Why not? Your old monitor could have broken, or you might just want a newer one for many reasons: you want a larger one or a second one, or you'd like a feature like VRR because you can (or will be able to) use it and there's a good deal on the monitor now.

 

Not every purchase has to be time-aligned with other parts. Why did I buy a 6800 XT to put in a 4930K system? Or why buy a new CPU and motherboard if you're just going to stick an old GPU in it? Waiting forever means you wait forever, or you pass up the best deal you'll get for a while simply because you couldn't get one of the things you wanted now, or at the price you want to pay.


9 minutes ago, Moonzy said:

There's no rule saying you need a GPU capable of pushing 1440p before buying a 1440p monitor.

Some people buy their equipment piece by piece.

Yup, I would fall into that category. And if the market wasn't completely FUBAR, sure. But right now? Not knowing when, or hell, if, you'll be able to get a good GPU for your system? It just seems foolish to me. By the time you do get the GPU, odds are monitors with the specs you want will be cheaper, or you'll be able to get something even better for the same price.

 

Plus, when it comes to gaming, GPU >> monitor. It makes more sense to save up the money for a potential future GPU upgrade than to spend it now on a monitor and not have it later for the GPU you want.

A 1080p60 monitor run by a 3070 makes no goddamn sense, but your games will run like a dream at max settings and with all the bling you can throw at it.

But a 1440p144Hz monitor run by a 1060 will struggle hard and you'll probably have to drop settings left and right to make the game playable at all (sans e-sports).

 

7 minutes ago, leadeater said:

Why not? Your old monitor could have broken, or you might just want a newer one for many reasons: you want a larger one or a second one, or you'd like a feature like VRR because you can (or will be able to) use it and there's a good deal on the monitor now.

Again, to save up for the GPU. We are mostly talking gaming here, and in that aspect, GPU >> Monitor.

If you do other stuff and want a monitor for media consumption, a multi-screen setup for work, or something specific for picture/video editing... sure, makes sense. But then does it really make sense to buy a 1440p monitor? That resolution basically exists for gaming. For anything else you're probably better off with 4K60, and you can get one stupidly cheap.
Hell, I'm using a 4K60 as a secondary monitor at home because it was so much cheaper than a 1440p60.

7 minutes ago, leadeater said:

Not every purchase has to be time-aligned with other parts. Why did I buy a 6800 XT to put in a 4930K system? Or why buy a new CPU and motherboard if you're just going to stick an old GPU in it? Waiting forever means you wait forever, or you pass up the best deal you'll get for a while simply because you couldn't get one of the things you wanted now, or at the price you want to pay.

I get what you're trying to say, but I don't think it's quite the same, not right now.


Ok, you got a super-powerful GPU and put it into a PC that's nearly a decade old. It's still going to be a huge improvement over whatever you had before and, like we all know, there's always a bottleneck somewhere in the system. And with the way the market is right now, it makes more sense to secure the GPU now, if you can, and wait on the CPU/Mobo/RAM combo, which is far more accessible and at more reasonable (sorta... DDR5...) prices.

 

And buying a new CPU/Mobo/RAM combo and keeping an old GPU can also make sense if your system was way too underpowered and was causing other issues.
I have a friend who I wish would save up for that upgrade, even if he keeps his old GTX 960, because he has a crappy i5-4460T (2.9GHz max boost!) and 8GB of RAM, and it's literally choking on everything he tries to play nowadays.
For such a case, a CPU/Mobo/RAM combo upgrade would be amazing.

 

But a new monitor, a new gaming monitor, without the GPU for it? Sorry but I just don't see it. I still think it makes no damn sense in the current market.

 

Holy crap sorry for the wall of text


30 minutes ago, Rauten said:

Again, to save up for the GPU. We are mostly talking gaming here, and in that aspect, GPU >> Monitor.

Well, one of your key arguments hinges on the fact that you think the current prices are too much; the flaw there is the assumption that prices will actually meaningfully go down. Just like every property market analyst in the history of my country and many others: their predictions of doom and falling prices never come true, and they keep making them every few years. Not being able to afford a GPU, or the one you want, is one thing; refusing to buy because it costs more than you are willing to pay is another.

 

I personally think the ship has sailed on pricing. I agree prices will go down, but top-end GPUs are never going to come back down to the range people are wanting to grasp on to. I paid more for my 6800 XT than I did for my 4930K, Asus Rampage IV BE, 16GB DDR3-2400, 2x R9 290X, and Samsung 840 Pro 256GB. And that 6800 XT was a good deal: $300 USD less than every other one in stock, and it was a Powercolor Liquid Devil, so it comes with the EK water block already on it, ready to go in my system; whatever I purchased was going to have a water block put on it, further increasing the price. I saved around $500 compared to every other option. There were two left, never to be restocked at that price again; I'd have been a fool not to go for it just because the cost was still vastly higher than 3-5 years ago.

 

If a good deal on a monitor comes around, there is no need to ignore it and complain just because you can't get a fancy new GPU for the price you'd prefer to pay. Buy it and enjoy it; you'll have it for many years. Similarly, my monitor, a Dell U3014, is 9 years old and I still don't feel the need to upgrade it.


18 minutes ago, leadeater said:

Well, one of your key arguments hinges on the fact that you think the current prices are too much; the flaw there is the assumption that prices will actually meaningfully go down.

Well aside from considering the prices to be too high (which they are), the real problem is...

18 minutes ago, leadeater said:

Not being able to afford a GPU, or the one you want, is one thing; refusing to buy because it costs more than you are willing to pay is another.

I'll go with not being able to afford it.

Many of us here can afford it. Many of us here cannot. And in general, I'd say that a large chunk of the population falls into the "cannot" basket, either because they literally don't have the money for it, or because they do have it but spending it would be a horrendous decision with a severe negative impact on their finances/livelihood.

 

Most people refusing to buy (like yours truly) can, by the very definition of "refusing to buy".
But we just don't want to, because f*** those prices, and/or our current GPUs can tide us over for now.

 

But I'd say we are a relative minority. Maybe not so much here in this forum, but we are also kind of an echo chamber - if we actively participate here, then we are tech enthusiasts and therefore more willing to spend large sums of moolah on PC components, if we have the bank account for it.

 

18 minutes ago, leadeater said:

I personally think the ship has sailed on pricing. I agree prices will go down, but top-end GPUs are never going to come back down to the range people are wanting to grasp on to.

Oh, I fully agree. My plan is to preorder (ugh... I think I just died a little inside...) as soon as an interesting RTX4000/RX7000 or whatever gets announced, before the price skyrockets again.

18 minutes ago, leadeater said:

my 6800 XT ..... Powercolor Liquid Devil so comes with the EK water block already on it.

Nice. Sexy.

18 minutes ago, leadeater said:

If a good deal on a monitor comes around, there is no need to ignore it and complain just because you can't get a fancy new GPU for the price you'd prefer to pay. Buy it and enjoy it; you'll have it for many years. Similarly, my monitor, a Dell U3014, is 9 years old and I still don't feel the need to upgrade it.

In your case, if you have no monetary problems, maybe, but again, I'd say most people just can't spend money on PCs/tech like this.

 

Also, "buy it and enjoy it" - that's my whole point: how are you going to enjoy a fancy new monitor if your GPU isn't powerful enough to make use of the increased resolution/framerate?


1 hour ago, Rauten said:

Again, to save up for the GPU. We are mostly talking gaming here, and in that aspect, GPU >> Monitor.

If you do other stuff and want a monitor for media consumption, a multi-screen setup for work, or something specific for picture/video editing... sure, makes sense. But then does it really make sense to buy a 1440p monitor? That resolution basically exists for gaming. For anything else you're probably better off with 4K60, and you can get one stupidly cheap.
Hell, I'm using a 4K60 as a secondary monitor at home because it was so much cheaper than a 1440p60.

I agree it makes no sense to buy parts one by one right now, because who knows when, or if, GPUs will return to more normal pricing. But a 1440p monitor is also nice for photo editing; it's a nice upgrade from 1080p. For video editing a 4K monitor might make more sense.

Also, mini-LED monitors are coming out; they have much better features than standard LED-backlit monitors and might be worth waiting for if you're willing to pay extra for new tech.


39 minutes ago, leadeater said:

Well, one of your key arguments hinges on the fact that you think the current prices are too much; the flaw there is the assumption that prices will actually meaningfully go down. Just like every property market analyst in the history of my country and many others: their predictions of doom and falling prices never come true, and they keep making them every few years. Not being able to afford a GPU, or the one you want, is one thing; refusing to buy because it costs more than you are willing to pay is another.

 

I personally think the ship has sailed on pricing. I agree prices will go down, but top-end GPUs are never going to come back down to the range people are wanting to grasp on to. I paid more for my 6800 XT than I did for my 4930K, Asus Rampage IV BE, 16GB DDR3-2400, 2x R9 290X, and Samsung 840 Pro 256GB. And that 6800 XT was a good deal: $300 USD less than every other one in stock, and it was a Powercolor Liquid Devil, so it comes with the EK water block already on it, ready to go in my system; whatever I purchased was going to have a water block put on it, further increasing the price. I saved around $500 compared to every other option. There were two left, never to be restocked at that price again; I'd have been a fool not to go for it just because the cost was still vastly higher than 3-5 years ago.

 

If a good deal on a monitor comes around, there is no need to ignore it and complain just because you can't get a fancy new GPU for the price you'd prefer to pay. Buy it and enjoy it; you'll have it for many years. Similarly, my monitor, a Dell U3014, is 9 years old and I still don't feel the need to upgrade it.

Your reasoning seems to be that you should buy something now or you might have FOMO because you missed out on a deal, even though the GPU is still priced way above what it should be.

I personally refuse to pay $1500 for what should be an $800 GPU at most. Yeah, I doubt the high end will ever come back to reasonable pricing, but I don't like it, and I wouldn't mind settling for less. I don't play the latest AAA games anymore anyway, as they're either too expensive or too buggy to be worth paying full price.


15 minutes ago, Rauten said:

My plan is to preorder (ugh... I think I just died a little inside...) as soon as an interesting RTX4000/RX7000 or whatever gets announced, before the price skyrockets again.

Nvidia stopped doing MSRP; I doubt they'll bring it back until things become sane again.

 

16 minutes ago, Rauten said:

how are you going to enjoy a fancy new monitor if your GPU isn't powerful enough to make use of the increased resolution/framerate?

There are games that run fine at 1440p 144Hz on a weak GPU.

Not everyone plays AAA games.

 

There are also other things you can do on a PC.

 

For example, I have a friend with a 1060 and a 1080p60 screen who wants to experience 144Hz, so I told him he might as well get a 1440p144 monitor and upgrade his GPU later.

 

12 minutes ago, Blademaster91 said:

I personally refuse to pay $1500 for what should be an $800 GPU at most.

Pretty much why Nvidia stopped giving an MSRP: let the market dictate the price.

 

12 minutes ago, Blademaster91 said:

I doubt the high end will ever come back to reasonable pricing.

Ditto; inflation has been OP for the past couple of years.


36 minutes ago, Rauten said:

Also, "buy it and enjoy it" - that's my whole point: how are you going to enjoy a fancy new monitor if your GPU isn't powerful enough to make use of the increased resolution/framerate?

Well, that highly depends on what you have, and likewise on which settings you are willing to turn down: the ones that probably do nothing visually to improve image quality and just lower the frame rate.

 

For example, you could have a GPU that does 90 FPS average across a wide selection of games at 1080p with Max/Ultra settings and will do 60 FPS if you upgrade to 1440p. How important framerate is will depend on the person and the games; however, you don't have to run every setting on Max/Ultra, so it's reasonably fair to say it's possible to get to, or close to, that previous 90 FPS while having equivalent or possibly better image quality due to the higher resolution.
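To make that 90-to-60 example concrete, here's a toy model; the 0.7 exponent is purely an assumed fudge factor for sub-linear GPU-bound scaling (a strictly pixel-bound load would be exponent 1.0), and real games vary:

```python
# Toy estimate of frame rate after a resolution change, assuming GPU-bound
# cost grows sub-linearly with pixel count. The 0.7 exponent is an assumption.

def estimate_fps(fps_base, res_base, res_new, exponent=0.7):
    px_base = res_base[0] * res_base[1]
    px_new = res_new[0] * res_new[1]
    return fps_base * (px_base / px_new) ** exponent

# ~90 FPS at 1080p lands around 60 FPS at 1440p under this assumption:
print(f"{estimate_fps(90, (1920, 1080), (2560, 1440)):.0f} FPS")  # ~60
```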

 

Honestly, a lot of gaming enthusiasts only use the graphical presets and will always aim for the highest one possible, to their own detriment. I personally set everything to Max/Ultra as well, but as my GPU ages and I'm not getting the performance I find sufficient, I start turning things down rather than looking to buy a new GPU, because I know many options at Max/Ultra don't "do anything" compared to High other than reduce frame rate and/or consume VRAM.

 

Right now I'd say more people would benefit from learning, or being willing, to optimize graphical settings than from buying a new GPU 🤷‍♂️


39 minutes ago, Blademaster91 said:

even though the GPU is still priced way above what it should be.

It's based on the fact that the price is what it is and will be for a long time, and IF it goes down, it won't be by nearly as much as I bet you are thinking or hoping.

 

39 minutes ago, Blademaster91 said:

Your reasoning seems to be that you should buy something now or you might have FOMO because you missed out on a deal

It's not FOMO at all; it's literally pointing out that if you wait forever, you wait forever and get nothing. If you want something and can get it but refuse to buy it because it costs too much, that is your choice; going ahead and buying it is not rooted in FOMO at all.

 

Do whatever you want, but I'll more than likely see you in two years with still no current-generation GPU because the pricing is the same as it is now. And if you've been waiting 2-3 years already, then all you've done is extend your GPU-less stretch by a further two years, or relented and purchased a previous-generation option.

 

Basically, you haven't achieved anything other than staying annoyed at the situation, and I'd ask: where exactly does that get you overall?

 

Not buying, in the hope and prayer that pricing will go down and that you'd otherwise regret spending the money, is literally also FOMO: fear of missing out on the lower price.


15 minutes ago, leadeater said:

Right now I'd say more people would benefit from learning, or being willing, to optimize graphical settings than from buying a new GPU 🤷‍♂️

People learning how to use their computers? Yeah, I'm going to refer you to your avatar.


12 minutes ago, leadeater said:

Honestly, a lot of gaming enthusiasts only use the graphical presets and will always aim for the highest one possible, to their own detriment. I personally set everything to Max/Ultra as well, but as my GPU ages and I'm not getting the performance I find sufficient, I start turning things down rather than looking to buy a new GPU, because I know many options at Max/Ultra don't "do anything" compared to High other than reduce frame rate and/or consume VRAM.

I think many people just want to play the game: pick the "best" preset that balances quality with acceptable framerate, and away you go. I do find that in many games the difference between High and Ultra is so small that unless you're comparing pixels between the two settings, you'll probably never notice it in actual play.

 

12 minutes ago, leadeater said:

Right now I'd say more people would benefit from learning, or being willing, to optimize graphical settings than from buying a new GPU 🤷‍♂️

For what were high-end gamers this is the case, but I feel those on a budget already had to learn how best to optimise settings. Gone are the days when you could buy a high-end GPU without breaking the bank and overpower pretty much all games. Still, we have more tools to potentially use with the upscaling options, so that may help somewhat.

 

Also, not going back to find the quote, but on the housing comparison: I feel that is a little different. If you don't have a house or somewhere to stay, you're probably not having a good time in life. Demand keeps increasing, and supply is not keeping up; it won't get resolved until that is resolved. With GPUs it may feel similar, but I'd argue the demand is not as critical. We do have potential events on the horizon that may help out with the supply and/or demand parts (the Intel Arc launch, Ethereum maybe finally going PoS), as well as longer-term ramps in production capacity. RAM in particular has shown cycles between over- and under-supply, for example. It's just that we're seeing it affect more areas now.


6 minutes ago, porina said:

For what were high-end gamers this is the case, but I feel those on a budget already had to learn how best to optimise settings.

Certainly lived that life for a very long time. I went through many computers without a single new part in them, across multiple generations. But that's not as good an option nowadays, sadly.

 

6 minutes ago, porina said:

Gone are the days when you could buy a high-end GPU without breaking the bank and overpower pretty much all games.

True, however the most vocal seem to be those that previously bought a high-end GPU and now can no longer afford, or are unwilling to pay, the current market price. So while this is most definitely the situation, I don't see it changing, and these vocal ones realistically only have one choice: accept and move on. The moving on may have more than one path, but acceptance must come first.


I completely agree with Kisai here. I did my own similar analysis and came to basically the same conclusion.

 

Another very interesting way to look at GPUs is to compare them based on power consumption. For example, the 1080 Ti and 2080 Ti had a TDP of 250W, which is comparable to the RTX 3070/RX 6800. This is where the top of the stack should be, but both Nvidia and AMD have pushed past this with their top GPUs due to AMD becoming more competitive. This is why Intel Arc was targeted at the 3070 as the high end, imo.

 

The high-end GPU tier should actually start at the RX 6600 XT/RTX 3060 level as far as I can tell, with mid-range below them.


4 minutes ago, HorseBattery said:

Another very interesting way to look at GPUs is to compare them based on power consumption. For example, the 1080 Ti and 2080 Ti had a TDP of 250W, which is comparable to the RTX 3070/RX 6800. This is where the top of the stack should be

Why? Why should the limit of high end be 250W?

 

I'm all for efficiency, but why not make a 500W GPU that performs twice as fast?

