
The Pizza Man

Member · 86 posts


  1. Please... Check your privilege, because you're clearly in a steady enough financial position that you can afford a full rebuild with $600+ video cards and all the rest whenever you dip below 100fps. You need to appreciate that you're the edge case, not the standard consumer.
  2. The 4070 Super is a 1080p card? Whelp. Looks like I've been priced out of 1440p gaming, then.
  3. Any of these would be a decent bump in performance, but prices vary on the used market and your mileage may vary when stepping up to a non-current GPU series. Unless you're absolutely dead set on buying new (see suggestions above), I'd try hunting down a used 6800 or 6800 XT in your position, so long as your PSU supports the extra power draw. EDIT: formatting.
  4. I think you're getting confused about how the shared pool of memory works, and about the relationship between the high- and low-bandwidth memory pools and caching on the console SSDs - it's why the I/O throughput figures and compression/decompression techniques were debated so much at the time of the consoles' release. It doesn't work this way on PC, where we still have separate pools of memory: RAM, storage, CPU cache, VRAM, etc. But that's beside the point, which is: it's not the developers' fault. They're using 12GB of VRAM now because that much (or more, or less) is available on consoles - Nvidia knows this and is spec'ing its products with barely enough VRAM so that by the time the next jump in video memory comes around, PC gamers HAVE to upgrade if they want a good experience. I'll say it again, as others have been: it's just a form of planned obsolescence that AMD Radeon hasn't chosen to adopt. Objectively speaking, unless there's something wrong with their 4070 Ti, OP would be better off with a Radeon GPU if they're encountering VRAM problems in the games they play.
  5. Because they clearly aren't aware of any reason why they'd pay the extra $200, and are checking with people who do know. It's not that they think a $200-400 bump in price is sensible at all. I wonder why you think paying for vbucks is sensible at all, but I'm not questioning your motives or knowledge. Please go and sit down in a quiet room...
  6. ^ I'm blaming neither the developers nor Nvidia specifically for the way things have turned out, but we've had the Xbox Series and PS5 consoles for some time now. The reason we didn't immediately see VRAM requirements jump is that nearly all multiplatform games released until recently have been cross-generational, so they also cater to the 8GB shared video/system memory pools on the older consoles. That has now doubled to 16GB, and this year (it would have been sooner if not for COVID) both PlayStation and Xbox are beginning to release titles that target only the 16GB shared memory pools on their latest consoles, which in turn has a knock-on effect on PC ports. While I'm sure optimisation is a large part of it, why would developers optimise below the 'minimum common factor' (i.e. consoles usually set the precedent) that sets the bar for lower specs in the first place when releasing a multiplatform title? We're going to see bigger textures, more complex effects and physics, better support for ray tracing (which also drinks VRAM), etc. You cite Cyberpunk on the 780 Ti as an example, but let's be honest: it's not a great experience, and while it does indeed run, a user probably shouldn't run it. It's plain to see how Nvidia can be blamed for their cards not lasting as long as they could by how they're spec'ing the 4000 series SKUs. It's the bare minimum. It falls under 'planned obsolescence' for me, but it seems easy enough for them to argue the contrary in a legal sense.
     ^ Well, yeah. People were saying this when the 780 Ti released, when the 980 Ti released, and again when the 2080 Ti released. Of course 24GB will no longer be enough at some point. As game development progresses and gets more complex, we need more room for more data. Currently, that's the natural order of things. One day that might change, but it's not changing this year, or even in the next 10 years, I dare say.
     ^ Why? The shared memory pool on consoles is there for developers to use. Why shouldn't they use the vast majority of it as VRAM, if that's what their game requires?
  7. I would just install their latest driver and see how it performs on your system, monitoring any crashes or weird behaviour as you go.
  8. I mean, the easiest answer is to just say "stop buying Nvidia" and move on. I've got a 2080 Ti still going, but as soon as I can manage it I'm jumping over to AMD Radeon. This doesn't fix your issue, though; games are only going to require more and more VRAM as time goes on, and we're looking at a whole slew of new releases by the end of the year, too. That said, maybe this is callous and unfeeling of me, but if it takes a bit of buyer's remorse and some returns or lost sales to prod Nvidia into pricing and spec'ing its cards better in the first place, then so be it, I suppose? Is your card still within the returns period? I would seriously consider trading it in for an RX 7900 XT, or returning it and splashing a little extra on the XTX.
  9. What I'm saying is that we get screwed on price most of the time because retailers treat pounds as equal to dollars: if the US MSRP is $600, then the UK MSRP will be £600, inclusive of tax and regardless of the exchange rate (see the worked conversion after this list). Yes, I get the "happy accident" of VAT being at 20%, but we've also got a much higher tax burden here than pretty much any other developed Western country at the moment, sales tax excluded.
  10. I thought the 7900XTX sat above the 4080 and below the 4090 most of the time? Discounting DLSS, that is.
  11. I do sort of agree with the sentiment that AMD can't ignore their own 6950 XT, because AFAIK they're still making that product, albeit likely in reduced quantities. If they stop making it and cut the price to clear stock, then release a 7800 XT with 10% more performance than the 6950 XT, why would I buy the 7800 XT at full price over a discounted previous-gen card that offers all but the top end of that performance (see the perf-per-pound sketch after this list)? But equally, the march of progress stops for no one; eventually the 7000 series Radeons will be in full swing and 6000 series supplies will run dry, surely. I still think arming products with price points that are super aggressive against a competitor that's trying to play 5D chess into the future is AMD's best weapon to, at the very least, buy time and mind share again.
  12. Except that DLSS 3 - especially the frame generation component - adds enough latency that I personally would rather avoid it in favour of a better performance-per-dollar product. I'm aware the 4070 does technically offer that right now, but I think things like DLSS muddy the waters a bit, and not in a good way... I think? It's just not as simple as that, but I'm still open to anything. I do think AMD has some compelling levers to pull before any of that, though.
  13. So in fact, you're saying their best bet is to drop the price of the 7900 XT and XTX to push into the 4070's and 4080's price brackets with elbows out, as aggressively as possible, then launch the lower-end RX cards as competitively as they can, and keep the 7950 XT and XTX as halo products?
  14. That's not how our tax system works; they don't just slap 20% on top of MSRPs. Anyway, it's sellers, not buyers, who are responsible for delivering accurate VAT returns, since they must account for VAT in their advertised prices (in most circumstances, anyway), and I as a buyer don't have to submit my tax receipts unless I've purposefully bought something free of tax. Sellers still want to compete, after all. Anyway, these prices look about in line with the $ MSRP.
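
On point 9, here's a rough back-of-the-envelope sketch of what a "fair" UK price would look like versus the pounds-equal-dollars sticker. The exchange rate is my own assumption (roughly $1.25 per £1, not from the post itself); the 20% figure is the UK standard VAT rate.

```python
# Back-of-envelope: US MSRP vs UK "pounds equal dollars" sticker (point 9).
# Assumed, not from the post: an exchange rate of 1.25 USD per GBP.
US_MSRP_USD = 600.00   # pre-tax US MSRP
USD_PER_GBP = 1.25     # assumed exchange rate
UK_VAT = 0.20          # UK standard VAT rate

pre_tax_gbp = US_MSRP_USD / USD_PER_GBP        # ~GBP 480 before tax
converted_price = pre_tax_gbp * (1 + UK_VAT)   # ~GBP 576 including VAT
sticker_price = 600.00                         # GBP-for-USD sticker price

premium = sticker_price / converted_price - 1
print(f"Converted + VAT: GBP {converted_price:.2f}")
print(f"UK sticker:      GBP {sticker_price:.2f} ({premium:.1%} premium)")
```

So even granting the VAT "happy accident", a £-for-$ sticker still carries a few percent of premium under that assumed exchange rate.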
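And on point 11, a tiny sketch of the perf-per-pound argument. The prices here are placeholders made up purely for illustration; only the "10% more perf" gap comes from the post.

```python
# Hypothetical perf-per-pound comparison for point 11.
# Prices are made-up placeholders; only the +10% perf gap is from the post.
cards = [
    {"name": "6950 XT (clearance)", "price_gbp": 550, "perf": 100},
    {"name": "7800 XT (full MSRP)", "price_gbp": 650, "perf": 110},  # +10% perf
]

for card in cards:
    value = card["perf"] / card["price_gbp"]
    print(f'{card["name"]}: {value:.3f} perf per GBP')
```

Under those placeholder numbers the clearance card wins on value, which is exactly why a 7800 XT launch price couldn't ignore remaining 6950 XT stock.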