
ASUS Has Listed the RTX 3080 Ti 20GB and RTX 3060 12GB on Their Service and Support Website

Random_Person1234

Summary

Listings for the RTX 3080 Ti 20GB and RTX 3060 12GB have been spotted on ASUS's service and support website, making ASUS the first manufacturer to confirm that these cards exist.

Spoiler

[Image: ASUS support-site listing for the ROG STRIX RTX 3080 Ti]

Quotes

Quote

The manufacturer becomes the first to confirm that such an NVIDIA SKU even exists. The listing reveals that two ROG STRIX graphics cards are in development: ROG-STRIX-RTX3080TI-O20G-GAMING (with factory overclocking) and ROG-STRIX-RTX3080TI-20G-GAMING (likely to feature reference clock speeds).

Quote

The listing also confirms that ASUS will launch RTX 3060 with 12GB memory. The company is expected to launch two ROG STRIX SKUs as well. The RTX 3060 12GB is the only SKU that was still planned for late January launch.

My thoughts

I wonder why Nvidia decided to strap 12GB of VRAM to the 3060. 

 

Sources

https://videocardz.com/newz/asus-confirms-rog-strix-geforce-rtx-3080-ti-graphics-card-with-20gb-memory

https://rog.asus.com/support

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO | Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU - EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case - Cooler Master TD500 Mesh

 


It'd be interesting to see what they'll price the RTX 3080 Ti at. If it's competing with the 6800 XT or the 6900 XT, it'd be a smack in the face for all consumers if they priced it only slightly higher than the 3080, but competition is always welcome. An extra 10GB of VRAM is huge, and it's even more concerning for RTX 3090 users, which is expected, I guess.

Quote or Tag people so they know that you've replied.


Didn't MSI mention this before?

Specs: Motherboard: Asus X470-PLUS TUF Gaming (Yes, I know it's poor, but I wasn't informed) RAM: Corsair VENGEANCE® LPX DDR4 3200MHz CL16-18-18-36 2x8GB

CPU: Ryzen 9 5900X          Case: Antec P8     PSU: Corsair RM850x                        Cooler: Antec K240 with two Noctua Industrial PPC 3000 PWM fans

Drives: Samsung 970 EVO Plus 250GB, Micron 1100 2TB, Seagate ST4000DM000/1F2168 GPU: EVGA RTX 2080 Ti Black Edition


34 minutes ago, Random_Person1234 said:

I wonder why Nvidia decided to strap 12GB of VRAM to the 3060. 

They'll probably release a 3060 with 6 or 8GB of VRAM, like with the 1060 6GB and 3GB variants. This is a bigger difference than 3GB vs 6GB though, so I do wonder why they didn't go with 10GB, for example.

The more I learn, the more I realise I don't actually know anything. 

 

Recommendations: Lian Li 205m (sleek, pretty decent airflow for a non-mesh front panel and cheap), i5-10400f (Ryzen 5 3600 performance, 20% cheaper), Arctic P14 PWM fans, Logitech g305.

 


1 hour ago, tishous said:

This is a bigger difference than 3GB vs 6GB though, so I do wonder why they didn't go with 10GB, for example.

Probably because it's easier to replace 1GB modules with 2GB modules than to add more module mounting points.
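
To put numbers on that (a quick sketch; the 192-bit bus is from the rumored 3060 specs and the 1GB/2GB chip densities are standard GDDR6 parts, so treat these as assumptions): each chip hangs off a 32-bit channel, so capacity moves in steps of the channel count.

bus_width = 192                  # bits, rumored RTX 3060 memory bus (assumed)
chips = bus_width // 32          # one GDDR6 chip per 32-bit channel -> 6 chips
for density_gb in (1, 2):        # 1GB (8Gb) vs 2GB (16Gb) chips
    print(f"{chips} x {density_gb}GB chips = {chips * density_gb}GB total")
# -> 6GB or 12GB; a 10GB config would need a different bus width,
#    e.g. 320-bit with 1GB chips as on the 3080.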

 

 

Spoiler

CPU: Intel i7 6850K

GPU: nVidia GTX 1080Ti (ZoTaC AMP! Extreme)

Motherboard: Gigabyte X99-UltraGaming

RAM: 16GB (2x 8GB) 3000MHz EVGA SuperSC DDR4

Case: RaidMax Delta I

PSU: ThermalTake DPS-G 750W 80+ Gold

Monitor: Samsung 32" UJ590 UHD

Keyboard: Corsair K70

Mouse: Corsair Scimitar

Audio: Logitech Z200 (desktop); Roland RH-300 (headphones)

 


5 hours ago, Random_Person1234 said:

I wonder why Nvidia decided to strap 12GB of VRAM to the 3060. 

 

 

I could see the VRAM being useful for production work. Blender, for instance, has come on leaps and bounds in the past couple of years, and performs faster on GeForce cards than on Quadro cards. (Outside CAD, Quadro cards aren't worthwhile over a GeForce card.) This could be a move to offer a cheaper production card.

My eyes see the past…

My camera lens sees the present…


I ordered an RTX 3080 yesterday after spotting sneaky stock at a local computer shop, coz I'm tired of waiting. I also have a 1440p monitor on the way, and these moar-VRAM versions will be just as unobtainable for months as these were. Thanks to the magic of VAT and availability, most cost close to or over 900€ anyway (similar to the GTX 1080 Ti years ago). I'm paying a bit more, but still under four figures. Coz f**k it. I should be fine with 10GB for 1440p, and I don't think I'll regret it.


4 hours ago, The1Dickens said:

Probably because it's easier to replace 1GB modules with 2GB modules than to add more module mounting points.

 

 

That is true.

 

I expect the 3080 Ti and the 3060 12GB to be about 25% more expensive than the 3080 and a 3060 6GB respectively. That's what I would guess, anyway.

The more I learn, the more I realise I don't actually know anything. 

 

Recommendations: Lian Li 205m (sleek, pretty decent airflow for a non-mesh front panel and cheap), i5-10400f (Ryzen 5 3600 performance, 20% cheaper), Arctic P14 PWM fans, Logitech g305.

 


So they added an extra 4GB to the 3060, eh...

Might be good for above-4K VR gaming. Then again, while the 3060 is still relatively powerful (a 2080 Super equivalent, more or less), it's not exactly a "beast" either. So it's questionable why they decided to do that with the 3060 and not the 3070.

Though of course, that's just gaming; it will likely still be useful in production work.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


It will be interesting to see what performance is like IF they don't change anything about the clock speed and core count.

That way we should be able to measure how much of an impact VRAM actually has on performance. My guess is that it is way less (for gaming) than most people think.

Remember, unused VRAM is wasted VRAM, and things like "memory usage" in Task Manager do not accurately show how much VRAM is actually in active use (because it could be old data that the computer hasn't bothered flushing yet).
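
As a concrete example of that gap (a minimal sketch, not a measurement method this post endorses): nvidia-smi, which ships with the NVIDIA driver, reports allocated VRAM, not VRAM in active use.

import subprocess

# Reports *allocated* VRAM, which can include stale data the driver
# hasn't evicted -- not the working set a game actually needs.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "7123 MiB, 10240 MiB"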


Bring on the 3080Ti.
I decided to wait out the 3080/3090 for a better card, based on rumors. Looks like I was right this time. (Now the hard part begins, trying to buy one.) 😖

I'm still not sure if it slots in between the 3080 and 3090, or sits at about the same level as the 3090, or better. I'm hoping it's at the 3090 level of performance; if it's better, then that's just a bonus to me.

CORSAIR RIPPER: AMD 3970X - 3080 Ti & 2080 Ti - 64GB RAM - 2.5TB NVMe SSDs - 35" G-Sync 120Hz 1440p
MFB (Mining/Folding/Boinc): AMD 1600 - 3080 & 1080 Ti - 16GB RAM - 240GB SSD
Dell OPTIPLEX: Intel i5 6500 - 8GB RAM - 256GB SSD

PC & CONSOLE GAMER

1 hour ago, TetraSky said:

Though of course, that's just gaming; it will likely still be useful in production work.

I think this is meant for production work, yes; to be sort of the "low-budget 3090/3080 Ti" for content creators with a smaller budget.

 

It won't make a difference in gaming, I'd say; the card is not close to powerful enough to reach VRAM limits (I mean, neither is the 3080, in everything I've seen).

1 hour ago, RejZoR said:

I'm paying a bit more, but still under four figures. Coz f**k it. I should be fine with 10GB for 1440p, and I don't think I'll regret it.

You won't be. 10GB has been more than enough at 4K for everything I've done (which is only MSFS; I have a 1440p monitor, but I run it at 1440p with 150% render scaling and have tried manually setting 4K as well).

 

 

There's also obviously more to VRAM than just capacity; the 12GB on the 3060 would not equal the 10GB on the 3080, because the bandwidth is very different (the 3060 has a 192-bit bus of GDDR6, I believe, compared to the 3080's 320-bit bus of GDDR6X).
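
Back-of-envelope numbers for that difference (a sketch; the 15 and 19 Gbps per-pin rates are the reference specs as I understand them, so treat them as assumptions):

def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    # bytes/s = (bus width in bits / 8) * per-pin data rate in Gbps
    return bus_width_bits / 8 * gbps_per_pin

rtx_3060 = bandwidth_gb_s(192, 15)  # 12GB GDDR6  @ 15 Gbps -> 360 GB/s
rtx_3080 = bandwidth_gb_s(320, 19)  # 10GB GDDR6X @ 19 Gbps -> 760 GB/s
print(f"RTX 3060: {rtx_3060:.0f} GB/s vs RTX 3080: {rtx_3080:.0f} GB/s")

So the 3080 has roughly twice the memory bandwidth despite the smaller capacity.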

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


Why the F****** are they messing with the VRAM, and on some of the cards that maybe don't need it?

Like: 3060 with 8GB, 3070 with 8GB, 3080 with 10GB.

I would rather have wanted a 3060 Ti with 8GB, 3070 with 10GB, 3080 with 12GB (depending on whether it gets the X type of memory or not).

Then AMD having, what was it, 16GB on all cards?

Now this? Just adding a bunch of VRAM like it was a joke?

Which means the 3070 might not be fully future-proof, sitting between the 3060 and the 3080?

And it all depends on the prices of these new GPUs.

 

Some games use like 6-7GB of VRAM on the highest settings, but if demand increased it could easily go beyond that.

Especially if they begin this data-streaming stuff from the Unreal Engine demo, loading in assets while you play/look at them.

In certain workloads 8GB could be more than enough, and some games use 4 to 6GB at 1080p, which becomes a bit more at higher resolutions.


4 hours ago, Quackers101 said:

Why the F****** are they messing with the VRAM, and on some of the cards that maybe don't need it?

Like: 3060 with 8GB, 3070 with 8GB, 3080 with 10GB.

I would rather have wanted a 3060 Ti with 8GB, 3070 with 10GB, 3080 with 12GB (depending on whether it gets the X type of memory or not).

Then AMD having, what was it, 16GB on all cards?

Now this? Just adding a bunch of VRAM like it was a joke?

Which means the 3070 might not be fully future-proof, sitting between the 3060 and the 3080?

And it all depends on the prices of these new GPUs.

 

Some games use like 6-7GB of VRAM on the highest settings, but if demand increased it could easily go beyond that.

Especially if they begin this data-streaming stuff from the Unreal Engine demo, loading in assets while you play/look at them.

In certain workloads 8GB could be more than enough, and some games use 4 to 6GB at 1080p, which becomes a bit more at higher resolutions.

I'm not gonna complain about more VRAM on cards. I was already apprehensive about buying the 3080, as it only has 10GB of VRAM, which is OK for 4K today in most games but probably won't be in 2 to 3 years. I would rather have 20GB of VRAM and not have to worry.


The reason why the 3060 is getting 12 GB VRAM is the same reason the 3080 Ti is getting released.

 

AMD.

 

Nvidia most likely found out how much VRAM the budget AMD cards (6700/6600) are going to get and upped the VRAM to beat their numbers. I see people here talking about what the cards "need", but these things have nothing to do with what is "needed". They need to make sure that their cards have "bigger numbers" than the competitor's cards, regardless of what is "needed", which is the only reason Nvidia is releasing the 3080 Ti. The vast majority of people have no idea whatsoever what VRAM is, so when they compare products they just look at whichever "number" is bigger.


28 minutes ago, Brooksie359 said:

I'm not gonna complain about more VRAM on cards. I was already apprehensive about buying the 3080, as it only has 10GB of VRAM, which is OK for 4K today in most games but probably won't be in 2 to 3 years. I would rather have 20GB of VRAM and not have to worry.

I'm just finding it annoying when AMD can just add a lot, and now Nvidia is just adding way more, when Nvidia likely knew that already.

I don't care that much about VRAM, but when it's a new generation, shouldn't there be a bit more future-proofing on something you can't upgrade?

I just get a bit annoyed with Nvidia, like when they did it with the Super cards etc., sometimes releasing a better version of a card for about the same price, or just making things confusing.

Like others said, that much VRAM might not matter if your GPU can't actually use all of it in games; maybe things change in the future with how things work.

For the 3070 and above, a few more GB would at least give some headroom, and it depends on what that costs: the 10GB+ cards might not end up much more expensive, depending on their launch prices, or this just puts the 3080 and the 3060 and below in a better light?

 

And for the comment above, that can also be taken another way if they want to cash in more:
if consumers hear about VRAM being an issue, they'd pay extra for that VRAM too.

I've noticed with my GTX 1000-series card that the 8GB is quite handy now, even though it might be just a GB or so above what it actually needs.

Until I maybe join the RTX series.


1 hour ago, Gamer Schnitzel said:

The reason why the 3060 is getting 12 GB VRAM is the same reason the 3080 Ti is getting released.

 

AMD.

 

Nvidia most likely found out how much VRAM the budget AMD cards (6700/6600) are going to get and upped the VRAM to beat their numbers. I see people here talking about what the cards "need", but these things have nothing to do with what is "needed". They need to make sure that their cards have "bigger numbers" than the competitor's cards, regardless of what is "needed", which is the only reason Nvidia is releasing the 3080 Ti. The vast majority of people have no idea whatsoever what VRAM is, so when they compare products they just look at whichever "number" is bigger.

It isn't necessarily "bigger numbers". With a non-competitive AMD, Nvidia can make segmenting mistakes and not blow up their company. (Not that Turing wasn't a sales disaster in the first year.) The issue with Ampere is that Nvidia's projections ended up unbalanced, while AMD hit their projections really well. Nvidia historically favors less VRAM (because it costs less in total) with higher-bandwidth memory channels; AMD has taken the wider/more-VRAM approach for a while. The issue is that "more VRAM" basically always wins out in the consumer space for anyone who plans to hold onto a card for a while. Especially for something like the Radeon 6800 non-XT at 16GB.

 

While it's a bit cheeky, we've already seen AMD exploit this issue with Godfall. At 4K Ultra, it uses 12GB of VRAM, which puts a performance delta in AMD's direction just flat-out. While 4K Ultra isn't where anyone should be playing, it's a standard test resolution/setting for benchmarking. To make matters worse, when ray tracing is used, it uses even more VRAM. Nvidia can't just quickly add more memory channels to a die; that would take a brand-new die, which would take a year. What they can do is double the number of chips per channel (or run double-density memory chips). That holds off the problem until their next designs get out.

 

While bigger numbers matter a lot (AMD sold a LOT of Threadrippers because 16 > 10, as shown on the forums here), there are very practical considerations too, because game devs tend to overdo it on texture sizes, while at the same time AMD has been selling 8GB cards below 150 USD for a long time now. Nvidia needed another memory channel, and plain GDDR6, on the mid-range cards.


1 hour ago, Taf the Ghost said:

While it's a bit cheeky, we've already seen AMD exploit this issue with Godfall. At 4K Ultra, it uses 12GB of VRAM, which puts a performance delta in AMD's direction just flat-out. While 4K Ultra isn't where anyone should be playing, it's a standard test resolution/setting for benchmarking. To make matters worse, when ray tracing is used, it uses even more VRAM. Nvidia can't just quickly add more memory channels to a die; that would take a brand-new die, which would take a year. What they can do is double the number of chips per channel (or run double-density memory chips). That holds off the problem until their next designs get out.

There have to be some positives/negatives to either solution, I guess.

But I wonder, is it going to help them somewhat with storing more data in VRAM too?

Have you seen it behave any differently at 4K, or is there a benchmark for things like that?


2 hours ago, Quackers101 said:

There have to be some positives/negatives to either solution, I guess.

But I wonder, is it going to help them somewhat with storing more data in VRAM too?

Have you seen it behave any differently at 4K, or is there a benchmark for things like that?

Getting accurate measurements of actively used VRAM is very difficult, but the easiest way to detect that you do not have enough is frame-time plotting, along with the 0.1% lows dropping considerably. When you start getting a lot of frame-time spikes, that is a very good sign that the active VRAM is greater than the capacity you have; turning down textures is the way to confirm this, if the spikes go away (the 0.1% lows will also recover to normal).
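
A minimal sketch of that detection idea (the spike ratio and cutoff are illustrative assumptions, not values from this post):

def low_fps(frame_times_ms, fraction=0.001):
    # Average FPS over the slowest `fraction` of frames (the 0.1% lows).
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

def looks_vram_limited(frame_times_ms, spike_ratio=4.0, cutoff=0.001):
    # Heuristic: too many frames taking several times the median frame time.
    ordered = sorted(frame_times_ms)
    median = ordered[len(ordered) // 2]
    spikes = sum(1 for t in frame_times_ms if t > spike_ratio * median)
    return spikes / len(frame_times_ms) > cutoff

# Usage: capture frame times (ms) with PresentMon/CapFrameX at high and
# then low texture settings; if the spikes disappear and low_fps()
# recovers at low textures, VRAM capacity was the likely culprit.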

 

There are games that use more than 8GB active, and even more than 10GB active, but it's really not that common. The problem is that it will become more common. I personally still don't think it's that great of an idea to be playing at 4K on any GPU currently, but w/e, people can do what they like. But if you don't use 4K, then the 3080's 10GB will not be a problem.

 

The positive is that it prevents performance degradation by providing enough VRAM for the active workload (more VRAM without more bandwidth does not increase performance); the negative is cost. Another negative does exist, but it's not that big of one, and it only applies if they go that route: back-side VRAM modules, which make cooling more complicated (some cards already do this).

 

As much as it shouldn't ever affect me, the 10GB 3080 is a card I will not buy simply because it only has 10GB. That's the great thing about competition: there is another equally performing product with 16GB which, for how long I typically keep my GPUs, makes it more appealing than the current Nvidia options. A 20GB 3080 might change that, but I suspect it'll price itself out of consideration compared to the 6800 XT.


On 12/26/2020 at 1:15 PM, leadeater said:

Getting accurate measurements of actively used VRAM is very difficult, but the easiest way to detect that you do not have enough is frame-time plotting, along with the 0.1% lows dropping considerably. When you start getting a lot of frame-time spikes, that is a very good sign that the active VRAM is greater than the capacity you have; turning down textures is the way to confirm this, if the spikes go away (the 0.1% lows will also recover to normal).

That's interesting... my way of checking this is (kinda) similar... first I compare what the in-game "estimate" says (if applicable), then I check what Afterburner etc. say... often a very similar number to what is estimated as the max usage. Then I simply crank up settings so that it brings me over or near my actual VRAM and, without fail, as soon as I go above it my games will start lagging, hanging, crashing... (in no particular order)

 

Of course this "evidence" isn't enough to convince the naysayers (to be fair, nothing would, probably...) 

 

But anyway, @SkilledRebuilds posted this Afterburner screen yesterday in another thread... I haven't seen that option and I cannot activate it; it would be interesting to know how...? (Maybe it's an AMD feature, maybe it's only certain cards, idk.)

 

[Image: MSI Afterburner screenshot showing a "GPU1 dedicated memory usage" monitoring graph]

 

 

🤔 

 

 

I would really like to activate this, simply to compare (I'm pretty sure, as per my own tests, this already works pretty well with the default settings), but I'd like to confirm it for myself. Also, I'm curious how the frig you even get this option to show up!  😅

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used: Corsair Link (Anime Edition), MSI Afterburner, OpenRGB, Lively Wallpaper, OBS Studio, Shutter Encoder, Avidemux, FSResizer, Audacity, VLC, WMP, GIMP, HWiNFO64, Paint, 3D Paint, GitHub Desktop, Superposition, Prime95, Aida64, GPUZ, CPUZ, Generic Logviewer


Okay... so while my "GPU1 dedicated memory usage" is in use and working, it's also now greyed out for me... (WTF, right? I remember enabling it, selecting GPU1 VRAM Dedicated), but now I can't go back into it.
So here are my CFG file settings, to hopefully trick it into being displayed anyway :)
This should enable the dedicated VRAM usage section in MSI Afterburner.

SO..
My MSIAfterburner.cfg (right-click the MSI Afterburner shortcut and open the file location):
it should open the Profiles directory. Open MSIAfterburner.cfg in Notepad and try manually pasting my values in there (after reading, of course).
Or just don't do it, as I have NFI what issues it could cause (although I doubt there are any), since I'm only adding a value and then enabling it.

Spoiler

This may or may not work; I hope it does and helps.

FIRST! (Probably important.) When you open that CFG in Notepad, scroll down just a little until you hit the Sources= line (many lines of +values separated by commas). Paste "+GPU1 dedicated memory usage," (no quotes; keep the + and the comma) at the end of the sources list.

With the CFG file still in Notepad, search for "[Monitoring]" to find the section below. GPU.dll=1 is the one I originally chose for it to be enabled, after clicking those 3 dots (greyed out now).

The GPU1 dedicated memory usage option may appear once you paste this into your cfg (included below) and relaunch MSI Afterburner. I recommend making a backup of your original CFG (copy, paste & rename the original) before editing.

Copy the below into your MSIAfterburner.cfg once you find the [Monitoring] section. (If a value is blank, leave it blank, as mine is set.) What else... oh yeah, you can edit the layout and all the needed stuff once it's enabled, as normal (I hope).


[Monitoring]
CPU.dll=1
GPU.dll=1
PerfCounter.dll=1
AIDA64.dll=1
[Source GPU1 dedicated memory usage]
ShowInOSD=1
ShowInLCD=0
ShowInTray=0
AlarmThresholdMin=
AlarmThresholdMax=
AlarmFlags=0
AlarmTimeout=5000
AlarmApp=
AlarmAppCmdLine=
EnableDataFiltering=0
MaxLimit=16000
MinLimit=0
Group=VRAM ACTUAL
Name=
TrayTextColor=FF0000h
TrayIconType=0
OSDItemType=0
GraphColor=00FF00h
Formula=

Please ask questions of me and others if you have never done this stuff before, as I don't want to be known for ruining your shizzle.

I don't even know if this will work for you, so... have at it.
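
For anyone comfortable scripting it, here's a hedged sketch (not from the post above) that automates those two edits: back up the file, append the source to the Sources= list, and append a minimal [Source ...] section. The install path, the file's encoding, and whether Afterburner accepts a section appended at the end of the file are all assumptions; run it elevated, or copy the cfg out of Program Files first.

from pathlib import Path

# Hypothetical default install path -- adjust for your system.
CFG = Path(r"C:\Program Files (x86)\MSI Afterburner\Profiles\MSIAfterburner.cfg")
ENTRY = "+GPU1 dedicated memory usage,"
SECTION = (
    "\n[Source GPU1 dedicated memory usage]\n"
    "ShowInOSD=1\n"
    "ShowInLCD=0\n"
    "ShowInTray=0\n"
    "Group=VRAM ACTUAL\n"
)

text = CFG.read_text(encoding="utf-8", errors="replace")
CFG.with_suffix(".bak").write_text(text, encoding="utf-8")  # backup first

if ENTRY not in text:
    lines = text.splitlines(keepends=True)
    for i, line in enumerate(lines):
        if line.startswith("Sources="):  # the comma-separated source list
            lines[i] = line.rstrip("\r\n") + ENTRY + "\n"
            break
    CFG.write_text("".join(lines) + SECTION, encoding="utf-8")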

Maximums - Asus Z97-K w/ i5 4690, BCLK @ 106.9MHz x 39 = 4.17GHz, 8GB of 2600MHz DDR3, Gigabyte GTX970 G1-Gaming @ 1550MHz

 


@SkilledRebuilds thanks, I'll give this a try later today...!

 

(and I'll ask if something is unclear; I've done my fair share of config file / exe / mod file / Steam client / skins / hex editing / hacking, but I'm not actually very skilled at it. I do, however, usually make backups so I can go back easily if something goes really wrong :p)

 

 

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used: Corsair Link (Anime Edition), MSI Afterburner, OpenRGB, Lively Wallpaper, OBS Studio, Shutter Encoder, Avidemux, FSResizer, Audacity, VLC, WMP, GIMP, HWiNFO64, Paint, 3D Paint, GitHub Desktop, Superposition, Prime95, Aida64, GPUZ, CPUZ, Generic Logviewer


On 12/26/2020 at 6:15 AM, leadeater said:

Getting accurate measurements of actively used VRAM is very difficult, but the easiest way to detect that you do not have enough is frame-time plotting, along with the 0.1% lows dropping considerably. When you start getting a lot of frame-time spikes, that is a very good sign that the active VRAM is greater than the capacity you have; turning down textures is the way to confirm this, if the spikes go away (the 0.1% lows will also recover to normal).

There are games that use more than 8GB active, and even more than 10GB active, but it's really not that common. The problem is that it will become more common. I personally still don't think it's that great of an idea to be playing at 4K on any GPU currently, but w/e, people can do what they like. But if you don't use 4K, then the 3080's 10GB will not be a problem.

The positive is that it prevents performance degradation by providing enough VRAM for the active workload (more VRAM without more bandwidth does not increase performance); the negative is cost. Another negative does exist, but it's not that big of one, and it only applies if they go that route: back-side VRAM modules, which make cooling more complicated (some cards already do this).

As much as it shouldn't ever affect me, the 10GB 3080 is a card I will not buy simply because it only has 10GB. That's the great thing about competition: there is another equally performing product with 16GB which, for how long I typically keep my GPUs, makes it more appealing than the current Nvidia options. A 20GB 3080 might change that, but I suspect it'll price itself out of consideration compared to the 6800 XT.

I wish to see whether these so-called compression techniques actually work in the majority of situations.


On 12/25/2020 at 11:55 PM, RejZoR said:

I ordered an RTX 3080 yesterday after spotting sneaky stock at a local computer shop, coz I'm tired of waiting. I also have a 1440p monitor on the way, and these moar-VRAM versions will be just as unobtainable for months as these were. Thanks to the magic of VAT and availability, most cost close to or over 900€ anyway (similar to the GTX 1080 Ti years ago). I'm paying a bit more, but still under four figures. Coz f**k it. I should be fine with 10GB for 1440p, and I don't think I'll regret it.

Right now my 3080 doesn't have any trouble even at 4K in Cyberpunk. I haven't yet managed to find a game where the 10GB limits performance to any degree. I also paid 900€ for mine because I didn't want to wait half a year after release. I haven't regretted it yet, and playing games like Control and Cyberpunk with RT enabled is just awesome ;). Have fun with yours!

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


1 hour ago, Stahlmann said:

Right now my 3080 doesn't have any trouble even at 4K in Cyberpunk. I haven't yet managed to find a game where the 10GB limits performance to any degree. I also paid 900€ for mine because I didn't want to wait half a year after release. I haven't regretted it yet, and playing games like Control and Cyberpunk with RT enabled is just awesome ;). Have fun with yours!

I ended up with a Palit RTX 3080 GamingPro, which should arrive tomorrow if all goes well. Not quite the card I wanted because, call me petty, but I really dislike the name Palit lol. I figured it doesn't matter anyway, as all RTX 3080s are basically the same, and if it's the only one I can get for a somewhat "reasonable" price, so be it. I do like the honeycomb airflow passthrough through the PCB and backplate, which is one of the nicest around. Mine was 990€. It's "a bit more" expensive, but they are almost all over 900€ over here, if you can even get one. I said I'm not gonna pay a four-digit figure. So, here we are.

