
3080 or 3080 ti?

Longshot

Ahoy!

I've just built a new PC, and all I have left to buy is the graphics card.

I don't plan to upgrade again for the next 5-6 years (hopefully 7).

 

I know the 3080 Ti hasn't launched or even been officially announced yet, but the leaks place it between the 3080 and the 3090 with 20 GB of VRAM.

 

I know the extra VRAM won't make much difference right now, but as with last gen, VRAM requirements increase drastically over time. The next generation of games could end up needing around 15-20 GB of VRAM to play at 1440p and 90+ FPS.

 

The price hasn't been announced yet, but when it is, it will likely land in my country at a fairly similar price point to the 3080 (around a $150-200 difference, I believe).

 

Would you say it's a better buy than the regular 3080? Given my wish to "future-proof" my PC, which is the better buy for the money?


It would be the better buy for "future-proofness," but who knows when it's coming, if it's coming at all. Depending on the price, it might also be terrible value, like the 6900 XT.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


3 hours ago, Stahlmann said:

It would be the better buy for "future-proofness," but who knows when it's coming, if it's coming at all. Depending on the price, it might also be terrible value, like the 6900 XT.

Yeah, likely. I considered the 6900 XT, but overall felt it was far worse value than the 3080, since it lacks features like DLSS and has weaker ray tracing; its only upside is the amount of VRAM, which was my main concern.



If you're going to keep the card for 5-6 years, just wait for the 3080 Ti for the VRAM. The price isn't official, but the rumor is $999 MSRP (AIB cards will probably be $1100-1300). I'm waiting on the 3080 Ti myself, but I'd be very surprised if it's adequate for 5 years; Cyberpunk 2077 already takes a fat dump on the 3090. So no future-proofing this gen, or ever, really. The 1080 Ti looks to be good for 4 years from launch, but that card was a unicorn.

 

One thing of note: we got a ~50% jump this gen, and we'll likely get another ~50% jump next gen. AMD has the performance crown with the 6900 XT, so it's time to see what Nvidia's been holding back. I'm not suggesting you wait for it, but keep that in mind.
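For what it's worth, those two estimated jumps compound. A minimal sketch, assuming the poster's ~50% per-generation figures (guesses, not benchmarks):

```python
# Compounding two hypothetical ~50% generational uplifts.
# The 1.5x per-gen figure is the poster's estimate, not measured data.
uplift_per_gen = 1.5

total_uplift = uplift_per_gen ** 2  # two generations
print(f"{total_uplift:.2f}x")  # 2.25x over two generations
```

So two back-to-back 50% jumps would mean next-next-gen cards roughly 2.25x faster than last gen, if the estimates hold.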

5950x 1.33v 5.05 4.5 88C 195w ll R20 12k ll drp4 ll x570 dark hero ll gskill 4x8gb 3666 14-14-14-32-320-24-2T (zen trfc)  1.45v 45C 1.15v soc ll 6950xt gaming x trio 325w 60C ll samsung 970 500gb nvme os ll sandisk 4tb ssd ll 6x nf12/14 ippc fans ll tt gt10 case ll evga g2 1300w ll w10 pro ll 34GN850B ll AW3423DW

 

9900k 1.36v 5.1avx 4.9ring 85C 195w (daily) 1.02v 4.3ghz 80w 50C R20 temps score=5500 ll D15 ll Z390 taichi ult 1.60 bios ll gskill 4x8gb 14-14-14-30-280-20 ddr3666bdie 1.45v 45C 1.22sa/1.18 io  ll EVGA 30 non90 tie ftw3 1920//10000 0.85v 300w 71C ll  6x nf14 ippc 2000rpm ll 500gb nvme 970 evo ll l sandisk 4tb sata ssd +4tb exssd backup ll 2x 500gb samsung 970 evo raid 0 llCorsair graphite 780T ll EVGA P2 1200w ll w10p ll NEC PA241w ll pa32ucg-k

 

prebuilt 5800 stock ll 2x8gb ddr4 cl17 3466 ll oem 3080 0.85v 1890//10000 290w 74C ll 27gl850b ll pa272w ll w11

 


For gaming, nah. IMO, by the time more than 10 GB is needed, these cards will be too slow to play at the resolution that needs that much VRAM anyway.

 

TL;DR: You'll have to go down in resolution and/or reduce textures anyway, since the GPU will be too slow. The 3080 already isn't a fully capable 4K GPU (where more VRAM is generally needed) right now, let alone in the future.

 

$699 vs. $999, or almost 43% more, for a tiny bit of extra performance and currently useless extra VRAM doesn't seem like a good idea.
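The "almost 43%" figure checks out against those two prices (the $999 is the rumored, not official, MSRP):

```python
# Price premium of the rumored 3080 Ti over the 3080's launch MSRP.
msrp_3080 = 699     # USD, official launch MSRP
msrp_3080_ti = 999  # USD, rumored only

premium = (msrp_3080_ti - msrp_3080) / msrp_3080 * 100
print(f"{premium:.1f}% more")  # 42.9% more, i.e. "almost 43%"
```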

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


  • 1 month later...
On 12/14/2020 at 7:45 PM, Mister Woof said:

by the time more than 10 GB is needed, these cards will be too slow to play at the resolution that needs that much

Let me tell you a little story @Mister Woof:

The year was 2012. I had just bought a 3930K, 16 GB of 2133 MHz DDR3 RAM, an ASUS Rampage IV Formula motherboard, an SSD, etc.

Basically a VERY high-end setup for gaming and casual workstation tasks in 2012, no doubt.

 

And when I asked whether I should spring for more VRAM? NOOO, they screamed: HAHAHA, you dummy. By the time games use more than 2 GB of VRAM, the actual GPU will be waaay too old.

 

The date was April 14th, 2015: the day GTA V finally hit PC (the September 2013 release was consoles only).

And my 6-core, 12-thread machine, overclocked to 4.6 GHz, water-cooled, with 16 GB of quad-channel memory and an ASUS GTX 680 DirectCU II, couldn't max out GTA V @ 1080p. Because guess what... it did not have enough VRAM.

 

Let me just remind everyone that the GTX 680 was basically Nvidia's best single-GPU graphics card in 2012. If you wanted 4 GB of VRAM, you needed to buy something like a GTX 690, which was a DUAL-GPU card. You could also get your hands on a 4 GB version of the GTX 680, but those were expensive, and nobody needed more than 2 GB of VRAM, right?

 

Btw, @Mister Woof, please don't take this the wrong way. I'm in no way trying to be aggressive or rude, trust me, but let's get serious here for a second.

A premium high-end configuration in 2012 SHOULD have been able to play AAA games at max settings for at least a couple of years, and that is the bottom line. I know, I know: Moore's law comments incoming. HOWEVER: if you PAY for ultra-high-end performance, you EXPECT ultra-high-end performance, and nothing can save you from feeling a bit ripped off.

 

After the GTX 680 came the GTX 780, with 50% more VRAM (3 GB).

The way things stand today: mid-to-high-end cards with 8 GB of VRAM have been out for 4 years now. 10 GB in a card worth $1200, or even $699, is NOT enough. Period. And Nvidia knows this. In fact, they're counting on it, and it absolutely sucks, because you'll basically need to replace your GPU when you run out of VRAM. When could that happen? Hmm, oh, I don't know, maybe when the next AAA game releases in a year or two? But of course we can't know that for sure.

Please note: I'm talking about gaming at 1440p or higher here.


28 minutes ago, gal-m said:

The way things stand today: mid-to-high-end cards with 8 GB of VRAM have been out for 4 years now. 10 GB in a card worth $1200, or even $699, is NOT enough. Period.

If VRAM mattered, then the 6800 XT would be beating the 3080 at 4K, but it just doesn't, because it's too slow.

 

So at least for now, 10 GB is enough. And the 6800 XT being slower than the 3080 at 4K is an indicator that both will be too slow by the time 10 GB isn't enough.

 

And by the time they're starting to show their age, they'll likely need to drop settings anyway to keep up, regardless of VRAM, and at those reduced settings the VRAM requirements drop too.

 

I see your example, but it's not necessarily relevant, and I disagree, because you said 10 GB isn't enough, and it clearly is. More VRAM isn't helping the 6800 XT overtake the 3080 at 4K. You can theorize all you want about future requirements, but it's only ever going to be a guess. We can only look at the data we have now.

 

I'd also rather spend $700 now on a GPU that does well now, and then another $700 in a few years on whatever is fast then, than spend $1400 today on something that's overkill for now and likely not good enough for tomorrow anyway (here's looking at you, Titan Xp / 2080 Ti / Titan RTX).

 

Is the extra VRAM in the Titan Xp making a difference?



By the time the 3080 doesn't have enough memory, the 4080 will be launching. I'd bet money on that.

 

As has been said so many times, you'll run into FPS limitations in terms of what the GPU can actually do before the VRAM is touched. Even the most demanding games haven't been an issue for the 10 GB on the 3080.

 

A game that actually required 16 GB of VRAM would likely be unplayable on the 6800 XT or the 3080. They just don't have the GPU horsepower to do 4K, 5K, or 8K with such extremely demanding textures... and those are far above the consoles (which limit the progression of graphics and textures within a generation, since developers make games console-friendly).

 

The GPU in the NEW consoles is somewhere around an RTX 2080/3060 Ti (probably a little below). Developers are not going to make a game that requires 16 GB of VRAM, because it would run at 5-7 FPS on the new Xbox and PS5 (even with their upscaling / checkerboard rendering). They just don't have the horsepower to push textures demanding 16 GB of VRAM at 4K, and they just launched. The 3080 is already far ahead of the consoles (as is the 6800 XT). Either is fine, but VRAM really is not a limiting factor at this point.

 

 

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


7 hours ago, Mister Woof said:

I'd also rather spend $700 now on a GPU that does well now, and then another $700 in a few years

Yeah and that's the problem.

5 hours ago, Zberg said:

By the time the 3080 doesn't have enough memory, the 4080 will be launching.

Again - that IS the issue.

 

What Nvidia wants is for us to behave like little kiddos buying up every new iPhone that comes out. We (sensible, financially responsible adults) usually don't fall for companies scamming us into buying their new sh*t every year, but due to limitations (VRAM), we're going to be forced to.

When I buy a high-end product, I expect that high-end product to LAST; otherwise I wouldn't be buying a high-end product.

When I buy an RTX 3080, I expect to be playing games at 1440p maxed out today and to gradually start lowering my settings in the years to come, without being PHYSICALLY limited (there not being enough VRAM chips on the board). In the example above, my actual graphics chip, the GK104, was not really the limiting factor. I consider 30-40 FPS to be completely acceptable in 5-6 years with decent settings, but, say it with me now @Mister Woof & @Zberg, I WAS BEING LIMITED BY THE VRAM, so my games didn't run that well, even though the actual chip was perfectly capable of handling them. PLEASE NOTE: when I say maxed out, I'm excluding gimmicky options like ray tracing, since the visual improvements aren't worth the cost in FPS (in my opinion). Not that ray tracing was a thing back in the day, but just so we're clear.

 

I personally don't buy a GPU every two years, because it's a total waste of money, and it generates tonnes of e-waste. I also don't buy a new phone every year, because that makes no sense if the one I have is fine. And I HATE companies forcing people to buy new when the old is, or would be (if they weren't purposefully limiting us *cough* VRAM *cough*), completely sufficient.

 

Take my 9-year-old PC for example: 3930K, 16 GB of RAM, and (currently) an RX 580, which was a $100 upgrade in 2019. It is 9 years old and it runs fine at 1080p. Yeah, it's time for an upgrade, but I got NINE excellent years out of it. NINE!

7 hours ago, Mister Woof said:

You can theorize all you want about future requirements, but it's only ever going to be a guess. We can only look at the data we have now.

Agreed and this is true, HOWEVER..

The point I'm trying to make, @Mister Woof & @Zberg, is that I think nGREEDia's behaviour is scummy. I personally find it a bit ridiculous that I'm thinking about potential upgrades in the NEAR future, straight after buying a new HIGH-END GPU for $1200. And yes, $1200, because THAT is what they're going for: whether you like it or not, we're looking at approx. $900 real-world prices at the moment, not $699.

 

I will close on this: I am speculating here, no doubt. I don't know what games are coming out, and I don't know what the requirements will be to play them. HOWEVER, I am a strong believer that history has a tendency to repeat itself, and I REALLY wouldn't be surprised if it does so again.

  • What does that mean? Well, I think your perfectly capable RTX 3080 is going to get limited by its VRAM, and thus won't be able to perform even though it COULD.
  • What will you do? Throw your hard-earned cash at companies that are TAKING ADVANTAGE of you.

I think this is funny and COMPLETELY disagree with it.

 

I have always expected to get around 6 years out of an 80-series Nvidia GPU, i.e. until its performance tier drifts down to entry level (GTX 680 ≈ GTX 770 ≈ GTX 960 ≈ GTX 1050). A new generation comes out every one or two years, so we're looking at around 6-8 years. Why? BECAUSE I PAID FOR IT.

Otherwise, high-end GPUs just don't make sense and everyone should just buy an RTX 3070 😅... (I see counter-arguments to that incoming, don't even try, hahaha)


The problem is your expectations.

 

Everything performance-related is diminishing returns.

 

The best value is everyone buying a $250 60-series GPU every couple of years to play at 1080p/60.

 

Performance above that costs money; it always has and it always will.

 

My recommendation is to stop buying expensive GPUs, because they're not right for you. The 80-series card isn't the right card for you.



2 minutes ago, Mister Woof said:

The problem is your expectations

You could say that, yeah... but no, not really. The problem is me refusing to get taken advantage of, @Mister Woof.

 

P.S. I made a quick edit of the comment above, but it's still essentially the same.


3 minutes ago, Mister Woof said:

My recommendation is to stop buying expensive GPUs, because they're not right for you. The 80-series card isn't the right card for you.

So you're saying I can't afford it? How can you say that? You don't know me. 

See?

 

Again, the issue is GREEDY, scummy companies. That IS the issue. If Nvidia made an RTX 3080 with, say, 12 GB or 14 GB of VRAM, I wouldn't complain at all.


I edited mine as well.

 

Buying an 80-series GPU isn't for future-proofing; that philosophy is naive and almost always a recipe for disappointment.

 

Buy what you can afford today, for the performance you want today.

 

Worry about tomorrow tomorrow.

 

If money is an issue, there's no reason you should even be considering the 80 series anyway.



Just now, gal-m said:

So you're saying I can't afford it? How can you say that? You don't know me. 

See?

 

Again, the issue is GREEDY, scummy companies. That IS the issue. If Nvidia made an RTX 3080 with, say, 12 GB or 14 GB of VRAM, I wouldn't complain at all.

I'm saying you won't be satisfied, because you expect it to play at maximum settings for 6 years.

 

And that's your problem, not Nvidia's.



Basically, @Mister Woof, I expected you to take the easy way out and try to shut me down with the old "if you can't afford it, stop complaining," because you KNOW we could be getting a bit more for our money. But again, you're really not understanding my point.

The problem is that we as consumers keep throwing our money at companies that take advantage of us. They're milking us AGAIN and AGAIN. I don't think that's right.

2 minutes ago, Mister Woof said:

Buying an 80-series GPU isn't for future-proofing; that philosophy is naive and almost always a recipe for disappointment.

I do agree with that, to an extent, but that's a topic for another discussion.


2 minutes ago, Mister Woof said:

you won't be satisfied, because you expect it to play at maximum settings for 6 years.

Have you read what I wrote @Mister Woof?

 

Here:

15 minutes ago, gal-m said:

I consider 30-40 FPS to be completely acceptable in 5-6 years with decent settings, but, say it with me now @Mister Woof & @Zberg, I WAS BEING LIMITED BY THE VRAM, so my games didn't run that well, even though the actual chip was perfectly capable of handling them.

 

Please read my post.


9 minutes ago, gal-m said:

Basically, @Mister Woof, I expected you to take the easy way out and try to shut me down with the old "if you can't afford it, stop complaining," because you KNOW we could be getting a bit more for our money. But again, you're really not understanding my point.

The problem is that we as consumers keep throwing our money at companies that take advantage of us. They're milking us AGAIN and AGAIN. I don't think that's right.

I do agree with that, to an extent, but that's a topic for another discussion.

Look, I'm a grown-up, and I'm guessing you might be one too.

 

I don't bitch and moan about people spending $100k on a luxury car, tell them they should spend less on a Honda, and then start talking about BMWs' poor resale value and maintenance costs as if that matters to them.

 

You know how much these cards cost, I know how much these cards cost, and we both have our own opinions about their performance and expected useful life.

 

As for me, 30-40 FPS is unacceptable. I'll play on low settings or at a lower resolution scale before I give up FPS, and at that point the VRAM amount is meaningless. So that's my opinion on expected performance.

 

Your angle is that of someone who just wants to prove he's correct on an internet forum.

 

Obviously, you don't want these cards. Don't buy them. Problem solved. Berating others for their personal choices and their logical, experience-based evaluations is... weird?

 

Just don't buy it if you don't want it.

 



7 minutes ago, Mister Woof said:

I don't bitch and moan about people spending $100k on a luxury car, tell them they should spend less on a Honda

I'm not bitching and moaning about you or anyone else buying an RTX 3080 or 3090. In fact, I'm usually the one telling people that buying high-end is an awesome choice if you can afford it. Seriously, I tell people to stop telling others what they don't need; if someone just wants to buy it, they should.

 

7 minutes ago, Mister Woof said:

Your angle is that of someone who just wants to prove he's correct on an internet forum

No.

 

Again, @Mister Woof, I'm really sorry, but you're not understanding me.

I'm not talking about your decisions, or the decisions of people who buy an RTX 3080.

I'm talking about how companies, IN MY OPINION, should pay more respect to the consumer.

 

Either way, I see you've started becoming a bit aggressive towards me, so I'm going to suggest we leave this discussion alone; otherwise it's just going to get locked anyway.


Enjoy ignore, you appear to just be trolling.



On 12/14/2020 at 2:47 PM, Longshot said:

I know the 3080 Ti hasn't launched or even been officially announced yet, but the leaks place it between the 3080 and the 3090 with 20 GB of VRAM.

 

As a matter of fact, there are leaks saying the 3080 Ti was canceled.

A PC Enthusiast since 2011
AMD Ryzen 7 5700X@4.65GHz | GIGABYTE GTX 1660 GAMING OC @ Core 2085MHz Memory 5000MHz
Cinebench R23: 15669cb | Unigine Superposition 1080p Extreme: 3566

3 minutes ago, Vishera said:

As a matter of fact, there are leaks saying the 3080 Ti was canceled.

My bet is they'll launch it in the fall as a refresh, the way the Super lineup launched with a refreshed naming scheme.



1 minute ago, Mister Woof said:

My bet is they'll launch it in the fall as a refresh, the way the Super lineup launched with a refreshed naming scheme.

It's possible, but waiting for something that might never even exist is not a good idea.


3 minutes ago, Vishera said:

It's possible, but waiting for something that might never even exist is not a good idea.

At this point in time, if you want to game, get the first available GPU at current MSRP that fits your performance requirements and budget.

 

Even last-gen and last-last-gen stuff is overpriced, so you can't even buy a hold-me-over card until then without throwing your money away.

 

A year ago you could've grabbed a 5700 XT for $360, and it would likely have held most users over at 1440p/1080p until all this crap boiled over.

 

But now... most markets don't have the luxury of picking and choosing which card to buy.

 

 



  • 1 month later...

Just my 2 cents: the Ti won't have enough oomph to future-proof for that long at max everything, not when Cyberpunk 2077 already makes even the 3090 just barely pass at 4K. Hell, dual 3090s in SLI had great FPS, but not crazy through-the-roof FPS, when LTT played SOTTR. And that's with TWO damned 3090s.

 

So for the price upcharge, I'd pass. Buy a share of Tesla, sell it in 2 years, and get a "free" top-of-the-line card then. Same reasoning as going 3080 vs. 3090: yup, the 3090 is better, but not more-than-twice-the-price better.
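On that 3080-vs-3090 value point, the launch MSRPs ($699 and $1499) actually make the gap even starker than it sounds. A quick check:

```python
# How much more the RTX 3090 cost than the RTX 3080 at launch MSRP.
msrp_3080 = 699   # USD
msrp_3090 = 1499  # USD

extra = (msrp_3090 - msrp_3080) / msrp_3080 * 100
print(f"{extra:.0f}% more")  # ~114% more money for a modest performance gain
```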

Also, I fully agree these are greed-based practices, but they won't change. If they literally cannot make enough cards to sell, they know money is a non-issue and they'll do whatever they want. The only thing that surprises me, given the shortages, is why they don't focus solely on making 3080s/3090s for more profit per unit of the limited supply available to sell.


  • 4 weeks later...

Precision X1 had a failed update; after a reboot I was greeted with a message. So from the looks of it, it's out.

IMG-0683.JPG

IMG-0684.JPG

