
Radeon 6900 XT vs RTX 3080?

Longshot
Solved by Briggsy.

Hi, so somehow two stores I ordered from, one with the 6900 XT and one with the 3080 (I'm getting one and cancelling the other), have both just told me the card is available for pickup.

 

I have an 850 W power supply and I have no idea which to choose.

 

Any tips please?

 

 

My new setup:

CPU: Ryzen 7 5800X

PSU: Gigabyte Aorus 850 W 80+ Gold

MOTHERBOARD: Asus X570-Plus

RAM: Ripjaws 32 GB 3200 MHz

Monitor: 1440p 144 Hz Dell

Case: be quiet! Dark Base 500DX

 

Main PC use:

- Gaming

- Game development

- Video and photo editing

- General web browsing


I thought the 6900 XT was on par with a 3090; are you sure you don't mean the 6800 XT?

Either way your PSU is fine for both. If you don't care about ray tracing, then in a 6900 XT vs 3080 comparison I'd say the 6900 wins easily.

 

That said, the 6900's MSRP is 300€ more than the 3080's, so it should be normal that it beats that card.


Which exact PSU? LTT managed to trip the overcurrent protection on a Seasonic 1000 W unit with a 6900 XT.
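For context on why an 850 W unit can still be marginal, here's a minimal back-of-envelope power budget. The TBP/PPT values are the vendor-rated figures; the rest-of-system draw and the transient multiplier are only assumed ballparks, not measurements:

```python
# Rough sustained and transient power estimate against an 850 W PSU (assumed figures).
GPU_TBP = {"RX 6900 XT": 300, "RTX 3080": 320}  # vendor-rated total board power, watts
CPU_PPT = 142           # Ryzen 7 5800X package power limit, watts
REST_OF_SYSTEM = 75     # motherboard, RAM, fans, drives - generous guess
TRANSIENT_FACTOR = 1.5  # assumed millisecond-scale GPU spike multiplier (what trips OCP)

for gpu, tbp in GPU_TBP.items():
    sustained = tbp + CPU_PPT + REST_OF_SYSTEM
    spike = tbp * TRANSIENT_FACTOR + CPU_PPT + REST_OF_SYSTEM
    print(f"{gpu}: ~{sustained} W sustained, ~{spike:.0f} W during spikes (PSU rated 850 W)")
```

Sustained draw leaves plenty of headroom on a decent 850 W unit; it's the short transient spikes that can trip overcurrent protection on some models.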

 

It also depends on whether you care about ray tracing and DLSS. When DLSS is enabled in supported games, the 6900 XT is left in the dust, and in ray-tracing titles the 6900 XT suffers much more than the Nvidia option.



4 minutes ago, Stahlmann said:

Which exact PSU? LTT managed to trip the overcurrent protection on a Seasonic 1000 W unit with a 6900 XT.

 

It also depends on whether you care about ray tracing and DLSS. When DLSS is enabled in supported games, the 6900 XT is left in the dust, and in ray-tracing titles the 6900 XT suffers much more than the Nvidia option.

But we must consider the extra VRAM for higher-res gaming.


Do you care about RT performance? The 6900 XT's major advantage is low-res gaming, and it either matches or loses (badly) to the 3080 in many other cases. I'd get the 3080 regardless with a monitor like that.



2 minutes ago, Ankh Tech said:

But we must consider the extra VRAM for higher-res gaming.

That's the worst problem with Big Navi: the 3080 beats the 6900 XT at 4K gaming despite its VRAM limitation.



3 minutes ago, Ankh Tech said:

But we must consider the extra VRAM for higher-res gaming.

No, we must not, as there are currently no games limited by 10 GB. The Nvidia cards are better across the board for high-res gaming.



3 minutes ago, Ankh Tech said:

But we must consider the extra VRAM for higher-res gaming.

Nah, it's not as if 10 GB of VRAM isn't enough for 1440p or 4K.


 


I'd get the 3080 personally. AMD still has awful driver support, and while the 3080 isn't exactly sipping power, the 6900 XT will guzzle much, much more, creating more heat in the process. The 3080 will net you the same or better performance in most cases (with exceptions, of course), and will do so with less power drawn and lower temps. Perhaps most importantly, it will do all this for $300 less.

 


While I like rooting for the underdog, AMD is hardly the underdog anymore. They own the console market and they've got their boot on Intel's throat. Big Navi is AMD's Zen moment for sure, but I owned a Zen 1 processor at launch and it was hot garbage, so that's not really a good selling point for Team Red imo.

 

Based on reviews and benchmarks, Ampere is the safer option this time. Plus, the 6900 XT is going to be as rare as hen's teeth for a long time. If you're concerned about power draw, Ampere undervolts extremely well.


 


Currently, it seems the best bang-for-your-buck card is the 3080. If you upgrade every time there's a new card, get the 3080. However, if you're going to keep your card for a long time and would like to gamble, you might want to consider the 6000 series. What I mean by gambling (and this is only a theory): we all know the current consoles are built around AMD GPUs, and consoles normally last more than 5 years. By that logic, most games in the coming years will be built and designed for these consoles, and therefore for AMD GPUs, whether with ray tracing or regular rasterization. But yeah, the safest bet is to get the 3080, unless the 3080 20 GB rumor is true.


2 hours ago, OrdinaryPhil said:

I'd get the 3080 personally. AMD still has awful driver support, and while the 3080 isn't exactly sipping power, the 6900 XT will guzzle much, much more, creating more heat in the process. The 3080 will net you the same or better performance in most cases (with exceptions, of course), and will do so with less power drawn and lower temps. Perhaps most importantly, it will do all this for $300 less.

 

Think u need to do ur homework mate.

 



2 hours ago, kitnoman said:

Currently, it seems the best bang-for-your-buck card is the 3080. If you upgrade every time there's a new card, get the 3080. However, if you're going to keep your card for a long time and would like to gamble, you might want to consider the 6000 series. What I mean by gambling (and this is only a theory): we all know the current consoles are built around AMD GPUs, and consoles normally last more than 5 years. By that logic, most games in the coming years will be built and designed for these consoles, and therefore for AMD GPUs, whether with ray tracing or regular rasterization. But yeah, the safest bet is to get the 3080, unless the 3080 20 GB rumor is true.

Why do people say AMD cards age better than Nvidia's? I would assume the card with better performance would age better(?)



3 hours ago, dickjack said:

Why do people say AMD cards age better than Nvidia's? I would assume the card with better performance would age better(?)

It's a curious notion, for sure.

 

Back when AMD were overbuilding their hardware, it took months and sometimes years for the drivers to catch up, and so the fanboys fell back on the idea that AMD performance improved over time like fine wine, instead of acknowledging that AMD couldn't get their drivers optimized in a timely fashion for AAA games. 

 

The fine wine argument is a reaction to AMD's slow driver optimization: if AMD's drivers weren't slow to mature, there wouldn't be any "fine wine" improvements later. You can't have good launch drivers and big gains down the road; the two are mutually exclusive.

 

The only other aspect is the amount of VRAM AMD ships compared to Nvidia. Go all the way back to GCN 1.1 with the R9 290 and there were 4GB and 8GB variants, while Nvidia was playing around with 3GB 780s and 6GB Titans. As far back as I can remember, AMD have always had more VRAM. I think VRAM size might be the only meaningful way AMD cards could age better, but at some point all the VRAM in the world isn't going to give you more performance. In my own testing, Nvidia manages VRAM usage better than AMD does, so AMD having more VRAM might simply compensate for less aggressive memory management.


 


I have the same problem deciding between the new RX 6900 XT and the RTX 3080.

 

I searched for some benchmarks, and it seems like the RX 6900 XT is a little bit faster in most games.

 

However, it lacks DLSS, proper ray-tracing performance, and other Nvidia-only features.

 

My monitor is the LG 34GK950F; since it's a FreeSync 2 monitor, I thought it would be best to get an AMD card.

 

I don't really know which card I should get.

 

The CPU I'm going to use is a Ryzen 7 5800X with an ASUS Crosshair VIII Formula and some 3800 CL14 Trident Z Neo RAM.


So if you were able to get both (how the hell did you do that?!?!?!), I would keep the 3080 and return the 6900 XT (and pocket the $200-300 saved). The difference in performance is going to range from negligible to a clear win for the RTX 3080 (with DLSS and/or RT). The 3080 is a safer bet as far as performance goes.

That being said, both are killer GPUs.



9 hours ago, Stahlmann said:

No, we must not, as there are currently no games limited by 10 GB. The Nvidia cards are better across the board for high-res gaming.

Yes, we definitely should. My GPU is already using almost 8 GB playing MHW at 1440p / low settings (btw, the game estimated "only" 6.5 GB with these settings).

 

What the "no games use more than 8 GB of VRAM anyway lul" crowd doesn't understand, or refuses to acknowledge, is that this isn't about what a game *actually* uses at any given moment or frame; it's about how much of the VRAM a game fills up, which can easily be measured with standard monitoring software like Afterburner, GPU-Z, etc. *That's* where issues like hiccups, frame drops and micro-stutters actually happen, and many games fill up 8 GB quite easily; the current ceiling is around 11 GB from what I could gather. So the only way to avoid these issues is to have enough VRAM, and 8 GB is laughably low and *not* enough, or to lower settings, in which case why even bother buying an expensive "next gen" GPU.
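As a concrete illustration of that allocated-vs-used distinction, here's a minimal polling sketch that reads the same counter those monitoring tools show. It assumes an Nvidia card and the `pynvml` bindings (`pip install nvidia-ml-py`); what it reports is VRAM allocated on the device, not the working set a game touches every frame:

```python
import time
import pynvml  # NVML bindings; Nvidia GPUs only

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # "used" here is memory allocated on the device (what Afterburner/GPU-Z report),
        # which is exactly the fill level being discussed above.
        print(f"VRAM allocated: {mem.used / 2**30:.2f} GiB / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```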

 

Anyways *on topic*... this is also kinda simple:

 

Do you care about recording videos with your GPU? Then choose Nvidia because they have the clear edge here. 

 

Do you not care about recording videos with your GPU? Then AMD is looking better, especially because of the VRAM amount, which isn't based on a standard from 2016 and should be a bit more future-proof.

 

 



I used to be conflicted as well between the 6000 series and the 3000 series. If I had more time, I'd probably wait for AMD's 7000 series, but I need a new GPU in the next few months and I've decided to get either a 3080 or a 3070. The safest bet right now is the 3000 series, as most current games favor Nvidia's features or are at least optimized for them. In older games that don't support DLSS and ray tracing, AMD are competitive, but in current games that do support DLSS and ray tracing they lose big time. However, future games will be built around AMD's hardware. They made the best GPU move when they got the deal to build the next-gen consoles. Their train of thought is probably: "If we're playing catch-up on hardware, let's get an advantage with the software (games) by having it optimized and built for our GPUs." (Again, that's in the future, but I think Nvidia is about to get "Inteled", lol, sooner or later.) I'm really rooting for AMD and hope another GPU maker enters the market, but I can't deny that the 3000 series wins this generation just because it has more mature features.

 

 

Edit: but if you're a gambler and want to gamble, you can go with AMD. As I've mentioned (theory), most next-gen games will be designed, optimized and built on and for the PS5/Xbox, ergo on AMD GPUs. With the knowledge they may or may not acquire from partnering with Microsoft/Sony, AMD might have better driver support this time. So yes, they might age well; focus on the word "might."


10 hours ago, Longshot said:

.

It's a bit complicated this time around, but I've narrowed it down to a 3080 Ti or a 6800 XT (AIB only). I'll just list the reasoning for each card:

 

3080 - the weakest one

6800 XT AIB capped at 2800 MHz, best deal at MSRP (the Asus 240 mm AIO looks sick even for 850 USD); I'd rather get this over the 6900 XT reference

6900 XT AIB possibly capped at 3000 MHz, fastest single card, but will they even have AIB cards before next gen is announced?

3080 Ti if I want to avoid AMD after all and go for RT/DLSS; rasterization performance is close enough to the 3090/6900 XT - confirmed specs put it ~2% slower than a 3090 (this is my most likely choice)

3090 makes no sense unless desperate


 


59 minutes ago, xg32 said:

It's a bit complicated this time around, but I've narrowed it down to a 3080 Ti or a 6800 XT (AIB only). I'll just list the reasoning for each card:

 

3080 - the weakest one

6800 XT AIB capped at 2800 MHz, best deal at MSRP (the Asus 240 mm AIO looks sick even for 850 USD); I'd rather get this over the 6900 XT reference

6900 XT AIB possibly capped at 3000 MHz, fastest single card, but will they even have AIB cards before next gen is announced?

3080 Ti if I want to avoid AMD after all and go for RT/DLSS; rasterization performance is close enough to the 3090/6900 XT - confirmed specs put it ~2% slower than a 3090 (this is my most likely choice)

3090 makes no sense unless desperate

I am somewhat confused; what do you recommend?


A 6800 XT AIB model or the 3080 Ti, since we can't be sure there will even be AIB models of the 6900 XT; it's heavily binned, but the reference AMD model gets beaten by an AIB 6800 XT. The 3080 (at ~750) isn't bad, but it's slower than a 6800 XT.

58 minutes ago, Longshot said:

I am somewhat confused; what do you recommend?

 


 


5 hours ago, Pascal3366 said:

I have the same problem deciding between the new RX 6900 XT and the RTX 3080.

 

I searched for some benchmarks, and it seems like the RX 6900 XT is a little bit faster in most games.

 

However, it lacks DLSS, proper ray-tracing performance, and other Nvidia-only features.

 

My monitor is the LG 34GK950F; since it's a FreeSync 2 monitor, I thought it would be best to get an AMD card.

 

I don't really know which card I should get.

 

The CPU I'm going to use is a Ryzen 7 5800X with an ASUS Crosshair VIII Formula and some 3800 CL14 Trident Z Neo RAM.

You and I are in a very similar situation. I have a FreeSync monitor too, and a Ryzen 5800X. Honestly, I was planning on getting AMD because of FreeSync, but I think I'm going to go with Nvidia: the games I'm looking at are high on fidelity, and I would honestly use DLSS and ray tracing in most AAA titles I play. I think that's what it comes down to and what you need to consider in the end, along with the price: whether the performance you get is worth the $300 you could otherwise spend in like 4 years on another amazing graphics card.


5 hours ago, MadPistol said:

So if you were able to get both (how the hell did you do that?!?!?!), I would keep the 3080 and return the 6900 XT (and pocket the $200-300 saved). The difference in performance is going to range from negligible to a clear win for the RTX 3080 (with DLSS and/or RT). The 3080 is a safer bet as far as performance goes.

That being said, both are killer GPUs.

 

Yeah, that was my guess too. I was honestly hoping the 6900 XT would have better RT support.

 

I basically queued up for one, then forgot about it and was prepared to just get the 6900 XT, so I queued up for it at another store. They both called me yesterday.


1 hour ago, Longshot said:

You and I are in a very similar situation. I have a FreeSync monitor too, and a Ryzen 5800X. Honestly, I was planning on getting AMD because of FreeSync, but I think I'm going to go with Nvidia: the games I'm looking at are high on fidelity, and I would honestly use DLSS and ray tracing in most AAA titles I play. I think that's what it comes down to and what you need to consider in the end, along with the price: whether the performance you get is worth the $300 you could otherwise spend in like 4 years on another amazing graphics card.

Well, honestly I don't care about the price as long as it doesn't exceed $1000 for a GPU.

 

The question is whether the extra $300 will get me more performance to drive my 144 Hz monitor, or whether I should get the 3080 instead for better features like DLSS (I don't care at all about ray tracing).

 

Honestly, all I want is raw performance, because I've had enough of playing at 20-30 fps in most games with my current R9 290X.

 

Since I have a 144 Hz monitor, I want at least 144 FPS at 3440x1440 with ultra settings. That's my target.
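For a rough sense of how demanding that target is, here's a simple pixel-throughput comparison. It's pure resolution-times-refresh arithmetic and ignores settings, engine and CPU limits, so treat the numbers only as relative scale:

```python
# Pixels per second at the stated target vs two common benchmark points.
resolutions = {
    "3440x1440 @ 144 Hz (target)": (3440, 1440, 144),
    "2560x1440 @ 144 Hz":          (2560, 1440, 144),
    "3840x2160 @ 60 Hz":           (3840, 2160, 60),
}
for name, (w, h, hz) in resolutions.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} Mpixels/s")
```

That puts 3440x1440 at 144 Hz around 713 Mpixels/s, roughly 1.4x the pixel throughput of 4K at 60 Hz, so hitting 144 FPS at ultra in every game is a tall order for any single card of this generation.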

 

I am not sure if an RTX 3080 can deliver that experience and the RTX 3090 is too expensive.

 

So I am once again torn between the RX 6900 XT and the RTX 3080.

 

 

 


1 minute ago, Pascal3366 said:

Well, honestly I don't care about the price as long as it doesn't exceed $1000 for a GPU.

 

The question is whether the extra $300 will get me more performance to drive my 144 Hz monitor, or whether I should get the 3080 instead for better features like DLSS (I don't care at all about ray tracing).

 

Honestly, all I want is raw performance, because I've had enough of playing at 20-30 fps in most games with my current R9 290X.

 

Since I have a 144 Hz monitor, I want at least 144 FPS at 3440x1440 with ultra settings. That's my target.

 

I am not sure if an RTX 3080 can deliver that experience and the RTX 3090 is too expensive.

 

So I am once again torn between the RX 6900 XT and the RTX 3080.

 

 

 

If you don't care about ray tracing, keep the 6900 XT, since in that respect it's better, but you will lose out on some good features of the Nvidia card. Both will give you what you need, and more.

