
6800 XT or RTX 3080

11 minutes ago, papajo said:

That may have been partially my fault; click on the link, as apparently when I checked here on the post it doesn't zoom in much further: https://i.imgur.com/4BZwBys.png

Still looks very similar tbh, especially considering that the aircraft for the 3080 test is lower in altitude compared to that of the 6800 XT.

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


6 minutes ago, Hymenopus_Coronatus said:

Still looks very similar tbh, especially considering that the aircraft for the 3080 test is lower in altitude compared to that of the 6800 XT

Look around the river or highway (not sure which it is, I think it's a river) stretching along the middle of the circled areas; it's long and big, you can't miss it. Then notice that the buildings are chunkier/more 3D (and frankly there seem to be a lot more of them on the AMD side), while on the RTX side they are flatter. Also, next to the lake there is a skyscraper with some smaller buildings (to the left of the circled areas); on the Nvidia one it almost looks like a shadowy patch of forest, because of the less detailed shadows and maybe because the 3D data did not load.


1 minute ago, papajo said:

Next to the lake there is a skyscraper with some smaller buildings (to the left of the circled areas); on the Nvidia one it almost looks like a shadowy patch of forest, because of the less detailed shadows and maybe because the 3D data did not load.

That could also be due to the lower altitude. The sim sometimes drastically changes the form of buildings as you get closer to them; I've had things like that happen.


1 minute ago, Hymenopus_Coronatus said:

The sim sometimes drastically changes the form of buildings as you get closer to them

That sounds like a VRAM limitation on your side 😛 


Just now, papajo said:

That sounds like a VRAM limitation on your side

This happens at 1440p too, no VRAM limitation there. 


Wait, on that image the 3080 test is on the right. I actually think the 3080 test looks better; the small houses actually look small, while on the AMD side it looks like the buildings are being "combined".

 

This could just be an issue with the autogen though


1 minute ago, Hymenopus_Coronatus said:

This happens at 1440p too, no VRAM limitation there. 

Well, I can't speak to that since I don't play the game, as I said before, but the post you initially replied to mentions that at 1080p it uses 7GB of VRAM, so 1440p probably comes close to 10.

 

  

On 1/21/2021 at 11:46 PM, Action_Johnson said:

I'd rather have 16GB of VRAM over DLSS or RT. I've seen videos where FS2020 uses about 24GB of system RAM and 10GB of VRAM when cruising around a dense spot on the planet at 4K. On my System I've seen FS2020 use 7GB of VRAM at 1080p and 22GB of system memory flying over spots like LA or Seattle. 

 

I've seen DLSS in person, and I don't think it's really that impressive. It's an obvious visual downgrade over native res, and it only exists because RT is simply too computationally expensive to run at native res, unless it's 1080p on a 3080. 

 


2 minutes ago, papajo said:

1080p it uses 7GB of VRAM so 1440p probably comes close to 10

Yeah, idk how that happened. At 1440p it uses 6GB of VRAM (or less) for me and usually allocates like 7GB or lower. So not even close to 10GB. 

 

This sim has too many variables to properly compare though, so it's hard to reach a conclusion without perfectly replicating the scenario, which is near impossible without a benchmark mode


Just now, Hymenopus_Coronatus said:

Yeah, idk how that happened. At 1440p it uses 6GB of VRAM (or less) for me and usually allocates like 7GB or lower. So not even close to 10GB. 

 

It obviously depends on the area; his 7GB load might be in a particular area. The VRAM usage is dynamic, not static.

 

Your "drastically change on buildings as you get closer" which sometimes happens as you said might be because at those times the Vram needs to be close or exceed 10GB yet the game engine decides to not got there because it cant and thus decides to load them while you are closer. 


1 minute ago, papajo said:

Your "drastically change on buildings as you get closer"

That happens regardless of GPU, though; it only loads photogrammetry data and some other higher-res imagery when you are close.


1 hour ago, papajo said:

It gives a performance boost because it can't make use of the actual RTX gimmick/feature, so it needs to degrade the image quality to an "unnoticeable" degree. 

 

And it also only works with a limited list of titles. 

 

At this point, what's the difference compared to, e.g., consoles using checkerboard rendering to maintain framerate? You are supposed to buy a PC in order to have better image quality.

DLSS 2.0 isn't as limited in title support as 1.0 was. Plus, going forward, more titles will support it than not.

Control - DLSS 2.0 ON vs OFF Comparison - 4k #RTXOn - YouTube

IMO, DLSS 2.0 looks the same as, and in some cases better than, having it turned off. It also increases frame rates by almost 100% in some cases.

As for RT power, the 6800 XT is on par with the 2080 Ti. Turn on DLSS, though, and even the 2080 Ti beats it slightly. The new RT cores on the 3080 are just much better than the ones on the 6800 XT.

As for the better image quality: if there is no discernible difference and you see a massive fps increase, that would be better quality in general. We aren't talking about using inferior techniques to reduce GPU workload and create more performance at the cost of visual fidelity. We are talking about using an AI engine to leverage TAA and produce a better image at a lower performance cost. I mean, AMD is trying to come up with their own solution for it now, since the Nvidia implementation has been such a huge success.
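To put rough numbers on why the gains can be that large, here's a quick back-of-the-envelope sketch in Python using the commonly cited DLSS 2.0 per-axis scale factors (roughly 67/58/50/33% for Quality/Balanced/Performance/Ultra Performance); those factors are general public figures I'm assuming, not anything measured in this thread:

# Rough sketch: internal render resolution for DLSS 2.0 modes at a 4K output.
# The per-axis scale factors are the commonly cited public values (an assumption).
OUTPUT = (3840, 2160)  # 4K target resolution

DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

out_pixels = OUTPUT[0] * OUTPUT[1]
for mode, scale in DLSS_MODES.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    print(f"{mode:>17}: {w}x{h} internal, {w * h / out_pixels:.0%} of the output pixels")

Performance mode shades only about a quarter of the output pixels and lets the AI pass reconstruct the rest, which is where the near-2x frame rates in RT-heavy titles come from.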


2 hours ago, AngryBeaver said:

I had the choice between a 6800xt and a RTX3080. I went with the 3080 for several reasons.

DLSS - This is a huge and pretty much free performance boost. It often produces much better results when it comes to quality than other options and does it while giving a performance boost.

Raytracing - New titles are implementing it, and with the new gen of consoles supporting it we will see much more going forward. It makes a huge difference in games that utilize it, and the 3080 has MUCH better RT performance.

GDDR6x - It is MUCH faster. Yes, you have less vram on the 3080, but when you have enough speed getting stuff into the memory isn't a problem. This is going to improve greatly in a month or so when the resizable bar support hits for Nvidia.

Drivers - I have encountered driver issues with almost every single AMD card I have ever owned. I can count on one hand the number of times Nvidia cards have given me issues, and normally it was something that was quickly fixed via beta drivers or a new hotfix release.

When it comes to rasterization, sure, the 6800 XT appears to have a slight edge, but at the fps these cards are pushing for pure raster content it isn't as big of a plus as it used to be. You have to look at the whole picture, and for me you get a lot of great stuff for the 50-dollar difference between the two cards.

Drivers:

Nvidia had more driver problems at launch than AMD did this generation. AMD also has better game optimization over time, or at least a good one: the RX 580 was once a little worse than the GTX 1060, and now it is better by a noticeable amount.

 

GDDR6X: That is why the 3080 is better than the 6800 XT at 4K and up, but the 6800 XT is a little better than the 3080 at 1080p and 1440p, since its cores clock much higher.
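For anyone wondering how big that raw speed gap actually is, here's a quick sketch using the published memory specs of the two cards (spec-sheet figures, not something measured in this thread):

# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8, in GB/s.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(f"RTX 3080   (320-bit GDDR6X @ 19 Gbps): {bandwidth_gbs(320, 19):.0f} GB/s")
print(f"RX 6800 XT (256-bit GDDR6  @ 16 Gbps): {bandwidth_gbs(256, 16):.0f} GB/s")
# The 6800 XT leans on its 128 MB Infinity Cache to claw back part of that raw-bandwidth gap.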

 

QUOTE ME FOR ANSWER.

Main PC: | Ryzen 7 3700x, OC to 4.2ghz @1.3V, 67C, or 4.4ghz @1.456V, 87C || Asus Strix 5700 XT, +50 core, +50 memory, +50 power (not a great overclocker) || Asus Strix B550-A || G.Skill Trident Z Neo RGB 32GB 3600MHz CL16-19-19-19-39, OC to 3733MHz with the same timings || Cooler Master ML360 RGB AIO || Phanteks P500A Digital || Thermaltake ToughPower Grand RGB 750W 80+ Gold || Samsung 850 250GB and Adata SX 6000 Lite 500GB || Toshiba 5400rpm 1TB || Asus ROG Theta 7.1 || Asus ROG Claymore || Asus Gladius 2 Origin gaming mouse || Monitor 1: Asus 1080p 144Hz || Monitor 2: AOC 1080p 75Hz ||

Test Rig: Ryzen 5 3400G || Gigabyte B450 S2H || HyperX Fury 2x4GB 2666MHz CL16 || Stock cooler || Antec NX100 || Silverstone Essential 400W || Transcend SSD 220S 480GB ||

Just Sold: | i3 9100F || MSI Gaming X GTX 1050 Ti || MSI Z390 A-Pro || Kingston 1x16GB 2400MHz CL17 || Stock cooler || Kolink Horizon RGB || Corsair CV 550W || PNY CS900 120GB ||

 



11 minutes ago, SavageNeo said:

Drivers:

Nvidia had more driver problems at launch than AMD did this generation. AMD also has better game optimization over time, or at least a good one: the RX 580 was once a little worse than the GTX 1060, and now it is better by a noticeable amount.

 

GDDR6X: That is why the 3080 is better than the 6800 XT at 4K and up, but the 6800 XT is a little better than the 3080 at 1080p and 1440p, since its cores clock much higher.

 

Speaking of clocks: my lowly little 3080 was having issues getting to 2GHz on air, but now, after slapping on a waterblock, I am seeing 2200MHz from just boosting and an undervolt. My card is stuck with a stupid 320W power limit, so that is my limiting factor at the moment until I find a BIOS that can bypass it.
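If you want to check whether your own card is boxed in by the same limit, here is a minimal sketch that just calls nvidia-smi (assuming an Nvidia card with the driver tools on the PATH; the query fields are standard nvidia-smi ones):

# Sketch: read the enforced, default and maximum board power limits via nvidia-smi.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,power.limit,power.default_limit,power.max_limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()

for line in out.splitlines():
    name, enforced, default, maximum = [field.strip() for field in line.split(",")]
    print(f"{name}: enforced {enforced}, default {default}, max {maximum}")
    # If the enforced limit already equals the max (e.g. 320 W on many 3080s),
    # only a different vBIOS will raise it further.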


For reference, I boosted the details on FS2020 last night on my 3080 and ran it at 4K. I went and flew around Los Angeles for a while, and VRAM allocation (not utilization) hovered around 9.5GB, bouncing up and down as I flew around. The framerate was 30-40ish, but it was very smooth. I could crank the details higher (mostly resolution scaling), but there's no point as I will be pushing into unplayable territory pretty quickly.

 

If FS2020 is the most VRAM intensive scenario, why is it an issue?

This is a catch-22.

People saying that 16GB VRAM on the RX 6800(XT)/6900 XT is necessary for the future are already lost, because the cards clearly begin to suffer around 4K. However, the 3080, which only has 10GB VRAM, is faster at 4K, and pulls way out ahead once you enable RT or DLSS.

 

So the card with LESS vram is a better 4K card than the cards with more vram. Clearly, Nvidia knew what they were doing with the RTX 3080, whether people are willing to admit it or not.

CPU: Ryzen 7 5800x3D || GPU: Gigabyte Windforce RTX 4090 || Memory: 32GB Corsair 3200mhz DDR4 || Motherboard: MSI B450 Tomahawk || SSD1: 500 GB Samsung 850 EVO M.2 (OS drive) || SSD2: 500 GB Samsung 860 EVO SATA (Cache Drive via PrimoCache) || Spinning Disks: 3 x 4TB Western Digital Blue HDD (RAID 0) || Monitor: LG CX 55" OLED TV || Sound: Schiit Stack (Modi 2/Magni 3) - Sennheiser HD 598, HiFiMan HE 400i || Keyboard: Logitech G915 TKL || Mouse: Logitech G502 Lightspeed || PSU: EVGA 1300-watt G+ PSU || Case: Fractal Design Pop XL Air
 


6 hours ago, MadPistol said:

For reference, I boosted the details on FS2020 last night on my 3080 and ran it at 4K. I went and flew around Los Angeles for a while, and VRAM allocation (not utilization) hovered around 9.5GB, bouncing up and down as I flew around. The framerate was 30-40ish, but it was very smooth. I could crank the details higher (mostly resolution scaling), but there's no point as I will be pushing into unplayable territory pretty quickly.

 

If FS2020 is the most VRAM intensive scenario, why is it an issue?

This is a catch-22.

People saying that 16GB VRAM on the RX 6800(XT)/6900 XT is necessary for the future are already lost, because the cards clearly begin to suffer around 4K. However, the 3080, which only has 10GB VRAM, is faster at 4K, and pulls way out ahead once you enable RT or DLSS.

 

So the card with LESS vram is a better 4K card than the cards with more vram. Clearly, Nvidia knew what they were doing with the RTX 3080, whether people are willing to admit it or not.

Ya, the funniest argument is the 6800. It really is just nowhere close to being able to push the frames at 4K to ever need 16GB of VRAM.

Do people not remember the Radeon VII? It had all that tasty VRAM and no one cared, because it couldn't actually push frames that mattered anywhere near it. This is literally the same scenario (at least with the 6800), although the 6800 XT is starting to approach the ability to push out frames at a high enough rate in titles using a lot of VRAM. But it still can't push playable framerates in titles requiring >10GB. Who cares if you can "play" a game requiring 15GB of VRAM if your card can only push 5 FPS in that game?

And importantly, we are always held back by many AAA games being optimized for consoles. Consoles have a GPU with less horsepower than a 2080. They aren't going to be making games this gen that the next-gen consoles have no chance of playing with decent textures or FPS. Cyberpunk is already pushing next-gen consoles and I am not hitting 10GB on it. The 3080 is just an all-around better buy due to its feature set. Priced at MSRP, it would be silly to take a 6800 XT over a 3080 unless you are playing at 1080p.

El Zoido:  9900k + RTX 4090 / 32 gb 3600mHz RAM / z390 Aorus Master 

 

The Box:  3900x + RTX 3080 /  32 gb 3000mHz RAM / B550 MSI mortar 


Well, I have both an ASUS TUF 3080 OC and an MSI Gaming X Trio 6800 XT. First, for me RTX is not important, and in the newest games the RT difference is not much; the 3080 wins with DLSS, so when AMD gets FidelityFX going the gap will close a bit more, and maybe with the consoles being AMD that is a plus, at least for me. In the games I play the 6800 XT does a better job at 1440p.

I play Apex / CS:GO competitively and I do like open-world games, but for now I have only had time to test Horizon Zero Dawn on both cards. For competitive settings in Apex and CS:GO the 6800 XT is way better; in Horizon I'd say they are equal.

The biggest difference for me was in Apex: the 3080 managed between 210 and 260 fps, while the 6800 XT hits 300 and holds 300 fps most of the time, and I never saw 99% usage on the GPU. The 3080 did a good job too, but the 6800 XT is better.

I'll leave a screenshot of Apex (competitive settings) so you can check: unknown-4.thumb.png.3e018ee9eee51fdb2b9630a94c68e0e2.png

Playing at 1440p with the 6800 XT MSI Gaming X Trio.

Ryzen 7 5800X, undervolted and at 4300MHz (pointless to use more than that for now), with 32GB of RAM.


16 hours ago, Zberg said:

Ya, the funniest argument is the 6800. It really is just nowhere close to being able to push the frames at 4K to ever need 16GB of VRAM.

Do people not remember the Radeon VII? It had all that tasty VRAM and no one cared, because it couldn't actually push frames that mattered anywhere near it. This is literally the same scenario (at least with the 6800), although the 6800 XT is starting to approach the ability to push out frames at a high enough rate in titles using a lot of VRAM. But it still can't push playable framerates in titles requiring >10GB. Who cares if you can "play" a game requiring 15GB of VRAM if your card can only push 5 FPS in that game?

And importantly, we are always held back by many AAA games being optimized for consoles. Consoles have a GPU with less horsepower than a 2080. They aren't going to be making games this gen that the next-gen consoles have no chance of playing with decent textures or FPS. Cyberpunk is already pushing next-gen consoles and I am not hitting 10GB on it. The 3080 is just an all-around better buy due to its feature set. Priced at MSRP, it would be silly to take a 6800 XT over a 3080 unless you are playing at 1080p.

Well, I'd say they are equal; yes, the 3080 wins in some games and loses in others, so, being neutral and an owner of both cards, I say they are equal.

But, as for memory... it's true you didn't see the 3080 hitting 10GB, but it does hit 8-8.5GB, sometimes almost 9GB of VRAM usage. And that is now... in one year games will easily hit 10GB, so only then will we see if it needs more or not. My guess is that the 3080 will fall behind in the future; it will always be a great card, but not as good as the 6800 XT.

But thinking about the future, I look at Valhalla, BF5 and Godfall, for example, and the 6800 XT performs better. It's a new product and games will start to be optimized for it, so only in a few months will we be able to say which one is better, but since it's cheaper and also has a lower TDP, the 6800 XT is the better buy.

Anyway, for now either of them will make you happy.


3080 if you care about ray tracing.

 

Also, in my experience, price-wise the 3080 is more often cheaper than the 6800 XT, and is closer in price to the 6800 non-XT.

 

So imo the comparison should be against the 6800 dollar for dollar.

 

The 3080 is a little slower than the 6800 XT, but costs a hundred or more less.

 

You can still get $699 FE 3080s... for the 6800 XT it is very difficult to get $649 reference cards, and you are more likely going to have to pay $800+ for an AIB card.

 

I paid $750 MSRP for my Asus TUF 3080 OC, which is generally cheaper than most 6800 XT AIBs and costs closer to an AIB 6800 non-XT.

Before you reply to my post, REFRESH. 99.99% chance I edited my post. 

 

My System: i7-13700KF // Corsair iCUE H150i Elite Capellix // MSI MPG Z690 Edge Wifi // 32GB DDR5 G. SKILL RIPJAWS S5 6000 CL32 // Nvidia RTX 4070 Super FE // Corsair 5000D Airflow // Corsair SP120 RGB Pro x7 // Seasonic Focus Plus Gold 850w //1TB ADATA XPG SX8200 Pro/1TB Teamgroup MP33/2TB Seagate 7200RPM Hard Drive // Displays: LG Ultragear 32GP83B x2 // Royal Kludge RK100 // Logitech G Pro X Superlight // Sennheiser DROP PC38x


No-brainer: DLSS and RT are features that put it way past anything the RX 6800 XT has to offer.

In terms of rasterization performance, both trade blows, with a slight advantage to the 6800 XT at 1080p and 1440p, but at 4K it's the other way around (funny how the 3080, with less VRAM, is faster at 4K).

Edit: Also, some users here can't tell the difference between VRAM allocation and actual usage (not surprising). Software like Afterburner and Precision reports VRAM allocation, NOT usage. 10GB of GDDR6X is more than enough for 4K.
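If you'd rather watch that number yourself than trust an overlay, here is a minimal sketch using the nvidia-ml-py / pynvml bindings (the pip package name is an assumption on my part; note that NVML, like the overlays, reports memory allocated on the device, not what the game is actively touching):

# Sketch: log allocated VRAM once per second via NVML (pip install nvidia-ml-py).
# This is allocation across the whole device, the same caveat as the overlays above.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"allocated: {mem.used / 2**30:5.2f} GiB / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()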


I have a 3080 and personally I wish that I had gotten a 6800 XT. That being said, if you have the option for either, you're in a special spot, as I only got mine when I did because I could.

 

I think it really depends on whether your focus is on AAA games or indie games. AAA games: 3080 for DLSS and ray tracing. Indie games: 6800 XT for better thermals (they need less power) and potentially better performance. With the 3080 you can't really overclock it above 2100MHz unless you feed it more power, which makes it hotter, which could end up with it throttling more. Compare that to the 5700 I recently sold. On that card I flashed it to the 5700 XT BIOS, maxed out power, undervolted it, AND overclocked it. You can't do that kind of thing with the 3080, and IIRC you still get those kinds of options with the 6000 line of cards.


7 hours ago, WakaDooda said:

I have a 3080 and personally I wish that I had gotten a 6800 XT. That being said, if you have the option for either, you're in a special spot, as I only got mine when I did because I could.

I think it really depends on whether your focus is on AAA games or indie games. AAA games: 3080 for DLSS and ray tracing. Indie games: 6800 XT for better thermals (they need less power) and potentially better performance. With the 3080 you can't really overclock it above 2100MHz unless you feed it more power, which makes it hotter, which could end up with it throttling more. Compare that to the 5700 I recently sold. On that card I flashed it to the 5700 XT BIOS, maxed out power, undervolted it, AND overclocked it. You can't do that kind of thing with the 3080, and IIRC you still get those kinds of options with the 6000 line of cards.

I had a 5700 XT before; I bought it when everyone was saying to get the 2070/2070 Super instead, and guess what, I was pretty happy with my old 5700 XT. I never had a single issue with it. And for those who say AMD drivers suck... well, maybe try them out first 😉

I own a 3080 and a 6800 XT, and I'd say that with the same CPU (R7 5800X @ 4.7GHz, 1.275V) the only advantage of the 3080 for now is ray tracing and DLSS together. I'm not a ray tracing guy; I think ray tracing is pointless for now. Even the most recent games don't look that awesome with it, not enough to make me prefer 70 fps with ray tracing over 144+ fps.

Anyway, about DLSS: it looks good, but ray tracing on high settings without DLSS looks way better than ultra with DLSS and barely gives the same amount of FPS, so I don't understand why people talk so much about DLSS. You've got a 3080, do this test yourself and tell me what you think.

"Yes, I do get 90 fps with ray tracing at ultra!!!" but it looks worse than even medium ray tracing settings... That's a pointless feature from what I've seen so far.

I bought both cards; I had the luck to find them in stock. I'm on vacation this week, and yesterday I did some more tests in games and decided to sell my 3080 and keep the 6800 XT... My Nvidia fanboy days died with my old 970.

BTW, quick tip: SAM does work very well. In many games you get over 20 fps more, in others up to 5, depending on the game, but the minimum framerates increase a lot in most games.
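If you're not sure whether SAM / Resizable BAR is actually active on your box, here's a rough Linux-only sketch that just parses lspci output (an assumption on my setup; on Windows, GPU-Z shows the same thing in its Resizable BAR field):

# Sketch: look for a large prefetchable BAR on the GPU, the usual sign that
# Resizable BAR / SAM is active (Linux only; assumes lspci is installed).
import re
import subprocess

out = subprocess.run(["lspci", "-v"], capture_output=True, text=True, check=True).stdout

gpu = None
for line in out.splitlines():
    if line and not line.startswith("\t") and "VGA compatible controller" in line:
        gpu = line.split(": ", 1)[-1]
    elif gpu and "prefetchable" in line and "[size=" in line:
        size = re.search(r"\[size=(\w+)\]", line).group(1)
        print(f"{gpu}: prefetchable BAR of {size}")
        # ~256M means ReBAR/SAM is off; a size near the full VRAM (e.g. 16G) means it's on.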

 


Anyway, pay attention to whether they fit your case. They are big cards. I'll leave pics of both:

IMG-20201222-WA0009.jpeg

IMG_20210120_202431.jpg


1 hour ago, Nets said:

I had a 5700 XT before; I bought it when everyone was saying to get the 2070/2070 Super instead, and guess what, I was pretty happy with my old 5700 XT. I never had a single issue with it. And for those who say AMD drivers suck... well, maybe try them out first 😉

I own a 3080 and a 6800 XT, and I'd say that with the same CPU (R7 5800X @ 4.7GHz, 1.275V) the only advantage of the 3080 for now is ray tracing and DLSS together. I'm not a ray tracing guy; I think ray tracing is pointless for now. Even the most recent games don't look that awesome with it, not enough to make me prefer 70 fps with ray tracing over 144+ fps.

Anyway, about DLSS: it looks good, but ray tracing on high settings without DLSS looks way better than ultra with DLSS and barely gives the same amount of FPS, so I don't understand why people talk so much about DLSS. You've got a 3080, do this test yourself and tell me what you think.

"Yes, I do get 90 fps with ray tracing at ultra!!!" but it looks worse than even medium ray tracing settings... That's a pointless feature from what I've seen so far.

I bought both cards; I had the luck to find them in stock. I'm on vacation this week, and yesterday I did some more tests in games and decided to sell my 3080 and keep the 6800 XT... My Nvidia fanboy days died with my old 970.

BTW, quick tip: SAM does work very well. In many games you get over 20 fps more, in others up to 5, depending on the game, but the minimum framerates increase a lot in most games.

 

I've said many times that the 6800 XT is the superior card if you game at 1080p or 1440p and don't plan to use games with RT or DLSS. I'm one of the few people that likes RT/DLSS and has a 4K120 display. Because of this, the RTX 3080 is a better fit for me, but for MOST people, I will agree that the 6800 XT is the better solution for now.

