
AMD did NOT disappoint me

3 hours ago, ewitte said:

That depends entirely on the title. I believe it was Minecraft RTX, but the numbers from Gamers Nexus stood out: AMD was getting 27 fps vs about 35 fps on Nvidia, but 71 fps with DLSS enabled, and in that title the visual difference was indiscernible. Yes, I agree it would be nice if it were implemented more, and AMD hasn't released their solution yet.

I was talking about the technology itself.

 

But that game you mentioned is about the most buffed-up RTX™ title there is (yet still a simple one; I wonder why they picked Minecraft 😙), and it's one of the few that even support some sort of ray tracing. It's a very limited list, maybe even smaller than the list of games supporting SLI...

 

And that's my point: if they want to bring ray tracing, let them bring real ray tracing, not partial ray tracing with image quality degraded to an "unnoticeable" degree (which sounds a lot like the "your eyes can only see 30 FPS" line console gamers used to use) just to get decent FPS.

 


1 hour ago, bit said:

Did I watch a different video from everyone else? Because this is exactly what I got from the LTT video, with the caveat at the end that if things like RT performance, DLSS, and NVENC are more important to you, then stick with Nvidia.

Maybe you watched Jay's or GN's by accident?

 

At 2:30 in the video Linus makes the bold claim that Nvidia has no chance against AMD, even though the chart clearly displays a lackluster lead even with SAM enabled. All the GPUs did about 90+ fps at 1440p even out of the box, and only the 3070 didn't break 120 FPS (unless you look at the 95th percentile, at which point only the SAM and DLSS results did). The conclusion, therefore, is that Nvidia is on par with AMD with or without DLSS/SAM enabled.

 

At 2:45 Linus goes on to state that performance suffers tremendously with RT enabled, which it does; however (and it's a big however), this is AMD's first go at it whereas it's Nvidia's second/third, and he failed to give credit where credit is due like others have. Equally, if you can game at 4K 30 FPS with DXR on, it's only, ONLY, 8 FPS less than a 3080, on their first attempt, without DLSS enabled. I have my doubts last year's cards could pull off AMD's performance. A 13 FPS difference is damn good for a first shot, as 79 FPS is very much playable. Even at 4K the FPS given is still playable for the XT, and the gap is even smaller at 7 FPS.

 

3:42 Is there any point in showing CS:GO anymore? Most cards will exceed 200 FPS now, so it's basically a pen-size competition to see whose is bigger. LTT is the only one of the three still doing it. They need to stop; it's a lazy man's way of padding out a video. Also, why the hell is the 6800 getting more FPS than the non-SAM 6800 XT?

 

4:05 Minecraft RT: Linus claims it's a mess. It really isn't; 35 FPS for a first attempt with a low of 29 is excellent, especially when you consider that Nvidia's 3000 series could in fact dip below that based on their chart, and the 6800 XT's RT is almost (or actually) equal to Nvidia's first attempt with the 2000 series (this test was done at 1440p; the 2000-series tests were done at 1080p). Is it better on Nvidia? Yes. Should you trust it? Hell NO. Should you buy a card purely for RT? No, but no one can stop you either. Also, someone at LTT clearly doesn't know their maths, as half of 64 is 32, not 35. (To make things worse, they emphasized it, so it was NO mistake.)

 

5:00 Linus openly admits he basically screwed up just a minute ago, which means the test shouldn't even have been done. Also, 35 FPS average is playable; even 29 is. Annoying, but playable.

 

5:24 Did Linus even double-check his facts against his own past videos before he spoke? Short answer: nope. Long answer:

https://www.youtube.com/watch?v=3bmQPx9EJLA 5700 XT review from 2019

The 5700 XT did BMW in 1:45 vs 0:50 for the 2070S, and did Classroom in 3:29 vs 2:45, whereas the 6800 XT did BMW in 0:38 vs 0:27 for the 3080 and outperformed the 3080 in Classroom by 12 seconds. In other words, on BMW, AMD shaved a whopping 1:07 off its time, whereas Nvidia only shaved off a pathetic 23 seconds.
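
If you want to sanity-check that arithmetic, here's a minimal sketch using the BMW times quoted above, converted from mm:ss (note Nvidia's gen-over-gen gain works out to 23 seconds, not the 12 the original post claimed):

# Minimal sanity check of the quoted Blender BMW times (mm:ss strings).
def secs(t: str) -> int:
    m, s = t.split(":")
    return int(m) * 60 + int(s)

bmw = {"5700 XT": "1:45", "2070S": "0:50", "6800 XT": "0:38", "3080": "0:27"}

amd_gain = secs(bmw["5700 XT"]) - secs(bmw["6800 XT"])  # 105 - 38 = 67 s (1:07)
nv_gain = secs(bmw["2070S"]) - secs(bmw["3080"])        # 50 - 27 = 23 s

print(f"AMD gen-over-gen gain: {amd_gain} s, Nvidia: {nv_gain} s")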

 

6:29 Not an issue with the video itself (I don't think), but for whatever reason, in 3ds Max the 3080 did far worse relative to the 2070S in the video above, unless they changed their scaling.

 

8:18 He hates on the fact that it's AMD's first attempt instead of giving credit, and he admits it's up against last gen but ignores what I stated previously. There is always one horse out of the gate first; that doesn't mean it will be the first across the finish line. Start hating on attempt 2 or 3 if they're still lagging, not on their first. It's the equivalent of hating the big automotive industry for not being first out with modern EVs (GM had the title, but they don't count since they killed theirs) and saying Teslas are obviously better because they were first, rather than because of the years of work underneath them, even though there are a few upcoming models that could very soon compete with them on a level playing field. So, once again, a lack of credit where credit is due.

Equally, RT isn't becoming more important day by day; the only major title coming out with RT support right now is CP2077. It'll become important when we can hit 60 FPS without DLSS or the like; anything less and you might as well not bother unless you hate your eyes. Or is there some AI card that can replicate its own cores inside a computer, giving you a day-by-day performance boost without my knowing?

 

RT (today) and DLSS (every day, unless you're on a strict budget) should not be considerations when buying a card. The only factors I will agree with are drivers (as mentioned in previous posts) and the lack of in-house recording/streaming. Overall the 6800 XT is basically the 3080's equal, and in six months' time it could become just as good at RT too.


 

All Linus did there was go straight for the jugular, basically ignoring the bulk of the letter just to complain about DLSS and how AMD doesn't have theirs ready. So what? You SHOULDN'T be relying on it. Also, I agree with AMD's statement; just watch Jay's video about the 6800 XT, where he uses some stupidly oversized monitor.

 

9:12 Linus becomes a hypocrite.

9:17 Where did AMD ever "despise" it? I agree with them: DLSS is useless. It works, but it shouldn't be relied on. That is why they are working on their own version but not intentionally pouring tons of resources into it, because it's useless. I'd rather see them put the time into improving RT and drivers, along with a more feature-rich experience overall, than into some stupid AI trick.

10:45 He's not wrong; it just makes all of his bashing and hating pointless. I'm happy for AMD, and so should everyone be, as it means next-gen cards are going to be awesome from both teams, and that I for one can't wait for.

10:55 Are you sure? You sounded very disappointed throughout the whole video.

11:15 Not really. If you ignore DLSS (which you should) and RT (which you should, for now), minus a few lackluster things like driver support and such, they are in line with the cards they're competing against. However, there is a reason the XT is $50 less.

12:05 Anyone who wanted and expected it is a fool.

 

At day's end, Nvidia is still the go-to for streamers, content creators, etc., for now. For everyone else, like the average gamer, AMD is just as good, so long as you don't care about RT and the shitty DLSS marketing gimmick.


2 hours ago, Egg-Roll said:

-full post snipped; quoted in its entirety just above-

Yeah, not to mention that they decided to fuzz up the graphs so that "no clear picture" can be seen, by mixing different products and settings along the axis. (Yes, obviously one can derive info from the graphs, especially if one knows what all these names mean and pauses the video, but let's not kid ourselves: it's always about shaping the opinion of the lowest common denominator and below. That's how ads work, so a squeaky kid with an attention span of a couple of seconds wouldn't derive much meaning from them; he'd just see it's not that impressive, since all the bars are close in size, and ask his dad for an Nvidia card for Christmas.)

 

On top of that, I believe they intentionally used less RAM to minimize the performance gains from SAM. Also, instead of showing the normal gaming benches in sequence and then the RTX benchmarks (I'll refrain from using the marketing lingo "rasterized benchmarks", since almost all the games people care about are rendered that way; RTX is at this point in time a gimmick with a limited list of supported games, not worthy of a category of its own), they show one normal game bench, then an RTX bench, then one normal bench, then an RTX bench, just to blunt the impression people get from seeing high FPS by showing them low FPS immediately after. I mean, no one has ever done it that way. I really think they wanted to make Nvidia shine in as good a light as possible in this video, despite it supposedly being about the RX 6000 series...

 

Plus, even the normal game benchmarks seem somewhat lower than most other benchmarks out there (e.g. Hardware Unboxed, Gamers Nexus, Hardware Canucks, etc.).


2 hours ago, papajo said:

-full post snipped; quoted in its entirety just above-

Their graph isn't really bad; it's just badly organized. It should be either 6800 XT vs 3080, 6800 vs 3070, etc., or 6800, 6800 XT vs 3080, 3070. The way the video has it, there's a massive gap, so you have to look down the list, and someone who doesn't pause might think SAM is an Nvidia thing because there's no time to double-check without pausing. That is one thing GN does right, with a countdown long enough to absorb the information. If they want such a layout, their chart should be vertical, like the big players (AMD etc.) do it: your eyes can gather all the needed information quickly, because all the words sit at the bottom and, looking up the chart, you can make sense of it logically. And using axis breaks to remove the empty run from 0 to whatever would make more sense than stretching it across the screen, forcing viewers to whip their head/eyes back and forth trying to gather the data.
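
To make the layout concrete, here's a minimal matplotlib sketch of the arrangement argued for above: head-to-head pairs grouped side by side, all the labels along the bottom. The FPS numbers are made up purely for illustration, not LTT's data.

import matplotlib.pyplot as plt

# Made-up 1440p averages, purely to illustrate the grouping.
x = [0, 1, 3, 4]  # a gap between the two matchups
labels = ["6800 XT", "3080", "6800", "3070"]
fps = [155, 150, 130, 122]

fig, ax = plt.subplots()
ax.bar(x, fps)
ax.set_xticks(x)
ax.set_xticklabels(labels)  # all the words sit at the bottom
ax.set_ylabel("Average FPS, 1440p")
ax.set_title("Pair the head-to-head cards instead of mixing everything")
plt.tight_layout()
plt.show()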

 

RAM isn't really cheap, and the last thing AMD wants is Nvidia throwing out Ti/Super cards tomorrow or in six months' time (which they likely could). It's actually smart for AMD: their cards are cheaper to make now, which means more money; more money equals more development and a longer life cycle, because Nvidia won't feel the need to toss new cards at the market. Nvidia, unlike Intel, has the ability to face AMD head-on right now, but AMD is being smart about it by just matching them and using the extra income to hopefully slip ahead in the next gen or two of cards. I do agree with you on RT, though. In the first benchmark (SotTR), while not as good as the 3080, the 6800 XT is still perfectly playable without SAM, and one could likely OC it to bump the FPS a little, maybe even push it to 90; out of the box, though, it's 100% playable.

 

It's not just the charts that hurt the card in the video but also the words. The charts are a disaster for anyone trying to match word to image, so the charts aren't much of an issue unless you fixate on the worst bar and the best bar. This whole video feels like it was designed to tear apart the 6800 XT while promoting the 3080: the 6800 was only mentioned once during the whole benchmark process, and only to compare it to the 3070, so one can easily assume they threw the 6800 in just to make the 6800 XT look worse on the fast-flipping charts.

 

Not sure, but I suspect they weren't using up-to-spec hardware, or did something to it; maybe they used a random Intel processor? They obviously had to use a Ryzen 5000 chip on the premise of SAM, but they never spec'd their setup, which is rather suspicious if you ask me, especially considering they normally would.

 

This feels a little too real now.

 

Also, a quick scroll down the YT comments shows we aren't the only ones having an issue with Linus's words.


13 minutes ago, Egg-Roll said:

Their graph isn't really bad; it's just badly organized.

That was my point. (In other words, I believe it was intentional, because it doesn't take much prep or post-editing to put the names in a better order, especially for people who have been doing that job for a few years and have done it right in the past.)

 

13 minutes ago, Egg-Roll said:

RAM isn't really cheap

It has gotten cheaper, although I agree that PC parts have ridiculous price tags nowadays (all a result of "free" YouTube lip service and passive-aggressive, sometimes just plain aggressive, promotion from YouTube channels which, instead of acting as tech journalists defending the interests of the consumer, suck on the corporate tit to get that free gear or a piece of hardware sooner for review, for the views), but that's another discussion.

 

Having said that, since it is what it is: if you've got $600-1000 to spend on the GPU alone, you've got $100-200 to spend on the RAM.

 

13 minutes ago, Egg-Roll said:

while not as good as the 3080

Well, that is your opinion. As far as my opinion goes, from all the metrics I've seen, and taking into account how they were produced, I believe the 6800 XT is at least as good as the 3080, if not better, especially in a Ryzen 5000 system with more than 16GB of fast RAM.

 

13 minutes ago, Egg-Roll said:

this whole video feels like it was designed to tear apart the 6800 XT while promoting the 3080

I agree on that one. 

 


1 hour ago, papajo said:

Well, that is your opinion. As far as my opinion goes, from all the metrics I've seen, and taking into account how they were produced, I believe the 6800 XT is at least as good as the 3080, if not better, especially in a Ryzen 5000 system with more than 16GB of fast RAM.

I was referring to DXR being turned on; their testing showed 79 vs 92. Perfectly playable; the only reason the 3080 is better is its better RT cores. I too think it's the equal in every way outside of RT, and in some cases better. It's why I find it suspicious they didn't list the rig used for testing.

 

1 hour ago, papajo said:

It has gotten cheaper, although I agree that PC parts have ridiculous price tags nowadays (all a result of "free" YouTube lip service and passive-aggressive, sometimes just plain aggressive, promotion from YouTube channels which, instead of acting as tech journalists defending the interests of the consumer, suck on the corporate tit to get that free gear or a piece of hardware sooner for review, for the views), but that's another discussion.

 

Having said that, since it is what it is: if you've got $600-1000 to spend on the GPU alone, you've got $100-200 to spend on the RAM.

Considering my first card (a BFG 7600 GT or 7800 GT) was $200 at the time of building, if memory serves, paying $600 for what you get now is pennies in comparison, lol. Then again, the R9 270 was also around $200... It also didn't help that crypto took over the industry, which pushed prices up further; too many factors piled on. Though when I bought my 1070 new it was about $600, I think, before crypto blew up, so pricing has stayed relatively level for each card tier at least. Not cheap, for sure, but I can't complain about the performance, and one can still build a decent system for under $1200 without a monitor.

 

1 hour ago, papajo said:

That was my point. (In other words, I believe it was intentional, because it doesn't take much prep or post-editing to put the names in a better order, especially for people who have been doing that job for a few years and have done it right in the past.)

Considering the size of LMG's crew, there's no excuse for this kind of video to exist, especially when channels with smaller workforces produce better-quality content. I would like to compare LMG to TBBT, but I can't even do that, because at least TBBT fact-checked their show before proceeding 😏


9 minutes ago, Egg-Roll said:

Considering my first card (a BFG 7600 GT or 7800 GT) was $200 at the time of building, if memory serves, paying $600 for what you get now is pennies in comparison, lol. Then again, the R9 270 was also around $200... It also didn't help that crypto took over the industry, which pushed prices up further; too many factors piled on. Though when I bought my 1070 new it was about $600, I think, before crypto blew up, so pricing has stayed relatively level for each card tier at least. Not cheap, for sure, but I can't complain about the performance, and one can still build a decent system for under $1200 without a monitor.

Say that again, bro, I hear ya...

 

Exactly! There was a time, not so long ago, when one could spend $1000 on an entire tower and get the best gaming performance possible, or very close to it (depending on how far back you go).

 

Now you need that much money for a single component (e.g. the graphics card), and you still won't have the best one.

 

I remember the GTX 1080 Ti costing over $1200, and it couldn't even viably play the games of its own generation at 4K (a standard resolution that existed quite a few years before Pascal was even on Nvidia's drawing board).

 

They ask for more and more and try to give less and less. For example, you now pay $1500 for an RTX 3090 (if you're lucky enough to find one, and luckier still if the one you found doesn't cost $2000+), and it doesn't even have a USB Type-C port for VR. I mean, how expensive would it be to add one? Yet they just made a strange-looking cooler (supposedly better than stock, but really nothing to write home about), only to hide the cut-down PCBs they sell you to save cost (cheaper VRMs, fewer components; they even cheaped out on the GPU capacitors, which led to the crashes when OCed)... my GTX 650 Ti has a bigger PCB than the 3090 FE, lol.

 

And they keep trying to sell you all those "cool" techs like DLSS and other crap that only let them sell you slower hardware that can "perform better" because you degrade image quality, sugar-coated with "AI", "RTX" and the like to make you feel better about it.

 

I really don't like AMD's prices either, but they are at least cheaper and give you some extra value (like USB Type-C/VirtualLink, three fans, no need to upgrade your PSU and get adapters, etc.).

 

 

Channels should push FPS-per-dollar ratios and value more, instead of being constantly hyped up about small advancements that cost tons of money to buy.

 

And it's not mining; it's far funnier and simpler than that... Jensen thought that if he started wearing a leather biker's jacket and giving presentations à la Steve Jobs, his company could become like Apple and sell at huge profit margins... and he was right, he succeeded. Prices really started to go crazy once Jensen started being a Jobs wannabe... 🤣

 


4 hours ago, Egg-Roll said:

Maybe you watched Jay's or GN's by accident?

 

-whole lotta hogwash snipped-

Nope. Only watched LTT. 

 

I skimmed through what you said and, I mean, I can kind of see where you're coming from? But it's like you're operating in a vacuum where no one knows anything and lacks any critical thinking skills. I'm sure the good majority of people watching it are already familiar with the RTX series, and I would hope are well aware that this is AMD's first venture into RT. I'm not going to nitpick everything you said, but it works like this: there are going to be three groups of people.

 

1) The people who know enough beforehand and can think critically. They come into the video with prior knowledge of things and can make judgements based on the information given, without taking everything presented as gospel.

 

2) The hardcore people. The ones who watch multiple videos or reviews of something before buying it. Even if LTT did a 100% thorough video on it, it wouldn't matter much, because they're also going to be watching x, y, and z for more info.

 

3) The average consumer, who doesn't give a shit. One thing I've noticed while being on the LTT forums is that everyone consistently overestimates what average consumers are doing. When I asked for a laptop rec here for school (as I hadn't dealt with laptops in ages), the one I said I was looking at was met with "This is absolute dog shit, get a second-hand ThinkPad or something else, it's SHIT". Which is funny, because at that time there were literally no reviews of it out, so how would they know? lol. So I bought it. 99% of the people who buy it would love this thing. They might watch this LTT vid and go "oh, Nvidia is better" and buy that. Or they may make a post on Reddit and go with the first recommendation there. Either way, whatever they get is going to suit them 1000000%.

 

In fact, imo, people are given too much info these days. The whole Nvidia Reflex BS and the constant pushing of high-refresh monitors is so tiring. It drives up sales because you have every person with half a brain cell going "oh no, I lost, and I could feel the 0.01ms jitter, gotta upgradeEEEEEEEE". Don't even get me started on the damage DF has done to gaming.

 

The only thing that really matters is whether it's comparable to the other video cards out there. If yes, it's fine. The end.


1 hour ago, bit said:

But it's like you're operating in a vacuum where no one knows anything and lacks any critical thinking skills.

Like you, I've lived through 2020; I'm not in a vacuum 😉 If critical thinking were part of everyone's daily life, families would be getting together for X-mas this year. Sadly, much like those critical thinking skills, no one should be getting together in most parts of the world this X-mas 😰.

 

1 hour ago, bit said:

I would hope are well aware that this is AMD's first venture into RT. I'm not going to nitpick everything you said, but it works like this.

The issue is that Linus is an influencer, and by making claims like those in the video he can spread misinformation, as has happened in the past. While I doubt this particular video will cause LMG any real trouble, continuing to mislead his followers can make the channel lose credibility. What I saw him do was make an extreme claim, then take a hammer and keep bashing the 6800 XT until it was left barely functioning, minus one or two points, making the 3080 look like a god in comparison and never giving the 6800 XT a fair chance at what most people would actually use the card for. Also, Linus should never assume, like you just did; he should have mentioned that this was AMD's first attempt at RT. He failed to do so, so someone who hasn't followed AMD GPUs for the past 3+ years (say, someone like me who bought an Nvidia 1000-series card) could easily conclude AMD simply can't keep up with Nvidia, rather than this being their first try.

 

Being an influencer carries a heavy burden, and being one-sided is not a good idea; this video was clearly slanted heavily towards Nvidia. LMG doesn't need to be paid by Nvidia to be called an Nvidia shill, and they don't need to get products for free either (though in this case they clearly did); they just have to believe Nvidia is the superior company (think iPhone vs Samsung S-series users), at which point AMD will never get a recommendation. So no, my comment isn't "hogwash". Linus was clearly partial towards Nvidia, and whether that was for financial gain or personal opinion doesn't matter: his job was to review AMD's cards properly, which he failed to do.

 

I do know one thing, though it doesn't absolve Linus of anything: Anthony wrote the 6800 XT script. He was also the one who reviewed and wrote last year's 5700 XT video, titled "Why NOT to buy Radeon 5700 XT… Yet – Our Review", which I linked above. The ironic part is that the clickbait title not only contradicts the review itself, but that video gave the card (which I bought, ultimately ignoring the issues raised) a fairer, less biased review than they gave the 6800 XT. If you haven't seen it, I strongly suggest you watch it and compare the two side by side if needed. No joke, please humor me on this and watch it.

Now you tell me what is wrong with the 6800 XT review compared to that one. I can tell you at least one thing: the 6800 XT script should never have been approved.

 

1 hour ago, bit said:

3) The average consumer, who doesn't give a shit. One thing I've noticed while being on the LTT forums is that everyone consistently overestimates what average consumers are doing. When I asked for a laptop rec here for school (as I hadn't dealt with laptops in ages), the one I said I was looking at was met with "This is absolute dog shit, get a second-hand ThinkPad or something else, it's SHIT". Which is funny, because at that time there were literally no reviews of it out, so how would they know? lol. So I bought it. 99% of the people who buy it would love this thing. They might watch this LTT vid and go "oh, Nvidia is better" and buy that. Or they may make a post on Reddit and go with the first recommendation there. Either way, whatever they get is going to suit them 1000000%.

It's clearly not just the people on the forums who overestimate; Linus did it too in this video, right up to his closing statement. I'm not joking: who gives a shit what Nvidia is working on with their RTX crap? It's built with streamers in mind, not everyday users, and at the end of the day the 6800 XT is an equal at most everyday tasks for the everyday person. If you stream or record gameplay, then fine, Nvidia can be the one to go to; but while I've never streamed, I've recorded long gameplay sessions using the R9 270 with zero issues, none whatsoever. That is a card older and cheaper than any card in this video, so I dare ask: how does Linus know the 6800 XT will suck at streaming and game capture? He doesn't. He's assuming, because it doesn't have Nvidia's built-in encoders.

 

The issue is you went against people's advice here because they are not Linus. I don't know which laptop you were looking at or its purpose, so I can't weigh in (and I'm not going to search for it either), but I disagree that 99% will love the thing they buy, especially when the purchase is based on this video. If someone sees this video today and, instead of going "OK, they're equal for casual gamers who don't need to make a living off the card", goes "Oh shit, AMD is utter shit because our lord and saviour has spoken by not recommending it, must buy Nvidia now", that person might end up going the scalper route. And even if not, if instead of waiting and getting a card that would serve them for years for less, they buy a 3080 off the back of A 6800 XT REVIEW, they will have wasted money. Could they be happy? Yes, they could be, but how happy would they be if they found out Linus basically swindled them into buying the 3080 by giving an unfair review of the 6800 XT?

 

Also, Reddit? 🤣 Where on Reddit would you even post such a question? PCMR is almost pro-Nvidia, and the Nvidia subreddit is going to be as biased as AMD's...

 

Oh yeah, while you're at it, why not just buy triple 3090 SLI, because that will suit EVERYONE. Even though it would be complete overkill for 99.99999999999...99999999999999999% of people, it would technically suit everyone. That's the issue when an influencer like Linus screws up: it leads to bad purchase decisions, because people lack the ability to think for themselves. If this video had come out two years ago, it could have had a real negative impact on AMD's sales given his popularity and ability to go viral. Luckily for him, 2020 has been a shitty year for most but a great year for all of tech, with everything selling out instantly, so this masterpiece of 💩 will be buried under better, proper reviews before it can do any damage to AMD's sales numbers.

 

2 hours ago, bit said:

In fact, imo, people are given too much info these days. The whole Nvidia Reflex BS and the constant pushing of high-refresh monitors is so tiring. It drives up sales because you have every person with half a brain cell going "oh no, I lost, and I could feel the 0.01ms jitter, gotta upgradeEEEEEEEE". Don't even get me started on the damage DF has done to gaming.

I bought my first, and maybe only, 144 Hz monitor this year; before that, 75 Hz ones served me just fine, and I never had issues in any game. Your issue with monitors applies to GPUs as well, and it's exactly what Linus did in this video: the GPU-world equivalent of "pushing high-refresh monitors", via Nvidia's lead in software and RT abilities. Also remember that Linus and other influencers are part of the reason for the continuous push towards high-refresh monitors; it's a vicious cycle. The producer gives the product to the media, the media brags to users, users buy the product, sales figures flow back to the producer, repeat. What happened here was: AMD gave Linus the card, Linus gave AMD the middle finger, Linus told users to buy Nvidia, AMD laughs because they sold out anyway, but the damage is done on both ends. By not telling the truth or giving an honest opinion, he will hurt himself more than AMD in the end, at least this time around.

 

Read some of the YT comments; they mock Linus over the whole "did not disappoint, but don't recommend" thing. If they didn't disappoint, why not recommend?

 

2 hours ago, bit said:

The only thing that really matters is whether it's comparable to the other video cards out there. If yes, it's fine. The end.

I fully agree, but that is not what happened in this video. Jay did it, GN did it; Linus did not. Like I said in the post above, Linus clearly didn't even fact-check his own "bashing" of AMD vs last year's model for productivity, so what else has he missed? I can tell you today that the 6800 XT is the equal of Nvidia's 3080 as long as you're not a streamer or someone who relies on the card to make money, which covers probably 90% of the end users watching this video. Comparing RT between them is unfair, since RT is (a) still too new and (b) second-gen for Nvidia; so I guess if you care about RT more than anything else, get a 3090. TA-DA! 🙌 I just did a better job than Linus did in the video.


Or, to sum it up another way: DLSS means the hardware that has it can't push the resolution natively without becoming unplayable.

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3000 MHz | Corsair RM1200i | 150TB | Asus TUF Gaming mid tower | 10Gb NIC


18 hours ago, papajo said:

I was talking about the technology itself.

 

But that game you mentioned is about the most buffed-up RTX™ title there is (yet still a simple one; I wonder why they picked Minecraft 😙), and it's one of the few that even support some sort of ray tracing. It's a very limited list, maybe even smaller than the list of games supporting SLI...

 

And that's my point: if they want to bring ray tracing, let them bring real ray tracing, not partial ray tracing with image quality degraded to an "unnoticeable" degree (which sounds a lot like the "your eyes can only see 30 FPS" line console gamers used to use) just to get decent FPS.

 

The thing is, it's about image quality, not just how you get there. Listening to many reviews, there are many instances where native ray tracing is completely unplayable, sitting between 20-30 fps on AMD and 30-40 fps on Nvidia, while performance jumps to 60-70+ with DLSS. The first revision, DLSS 1.0, was horrible, but the quality is very good with most 2.0 titles. If you don't care about it at all, then at this point AMD is the best bet. It's all about what is important to you; each has serious advantages over the other.
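
A rough sketch of where that FPS jump comes from: DLSS renders fewer pixels internally and upscales. The per-axis scale factors below are the commonly cited ones for DLSS 2.0's quality modes (my assumption for illustration, not figures from this thread):

# Internal render resolution per DLSS 2.0 mode at 4K output.
# Scale factors are the commonly cited per-axis ratios (assumed here).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

out_w, out_h = 3840, 2160
for mode, scale in modes.items():
    w, h = int(out_w * scale), int(out_h * scale)
    share = (w * h) / (out_w * out_h)
    print(f"{mode:12s} renders {w}x{h} (~{share:.0%} of the output pixels)")

Shading roughly a quarter to half of the output pixels is why a scene stuck around 30 fps native can land in the 60-70 fps range with DLSS on.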

 

Other thoughts: I've played around a bit with the new 3DMark ray-tracing benchmark. Performance is pretty bad at the default settings, but if you reduce the sample count it jumps to 100 fps. I can't really tell the difference, so some kind of game may be being played...

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


The 6800 is 'better' than the 3070... but here in the UK it's approx. £80 to £100 more expensive (although that will become clearer with the AIB launches).

 

Is it actually worth £80 to £100 more? Eh... I'm not sure it is: 10-15% additional performance for a circa 20% price uplift. Plus, debates about the effectiveness of RT and DLSS aside, on the 3070 you at least get a more polished effort you can choose to use. All that plus CUDA and streaming, and I think the 3070 is the better-rounded, general-purpose card. Of course, if you're all about raster fps, then AMD may be the better card... and there is a place for that, for sure.

 

I doubt anyone would be upset with either card, but IMO the 3070 is better positioned for the 'general hobbyist'.

 

Now, a 3070 Super for £50-£80 more than the 3070... that's an interesting proposition.

 

I reckon Jan 25.... 

 


21 hours ago, GDRRiley said:

I'm saying, given the 1.5-2x RAM prices, waiting for Zen 4 is silly.

It's going to be Zen 5 before DDR5 drops in price; the early capacity always goes to servers.

 

You may not have been around for the swap from DDR3 to DDR4, but there were dual-memory mobos at the time (ASRock, for example) where you could use either DDR3 or DDR4. No guarantee, but the same may be possible for the DDR4-to-DDR5 change. But your argument reinforces my original one: money spent upgrading now, FOR ME, is money wasted that could go towards a much better upgrade in 12-18 months, especially as my current system isn't noticeably holding me back at the moment.

Quote

AMD's shipping 1 million this year. Supply is growing; they should be easier to get in Q1.

 

 

Great, but you missed my point: time spent waiting just to buy a dead-end platform isn't enticing, and it merely shrinks the effective wait for the next platform, which will have an upgrade path and is therefore a much better investment FOR ME.

 

Quote

Why are you talking about upgrading after one gen?

If you're buying a $700 GPU, you've long since given up on good value. One-step-down cards used to be $350-400; now they're $500-600.

Again, I said there's no point FOR ME in spending $700+ on a 6000-series card with crippled ray-tracing performance when ray tracing is likely to be the future of gaming. The weak showing of the 3090 relative to the 3080 implies Nvidia is going to have their work cut out improving performance too. FOR ME, my judgement is that I'll wait for AMD's second generation of RT-capable GPUs, by which time they'll have refined the ray-tracing performance along with the software.

Quote

Zen 3 is going to have plenty of performance for a longer-term build.

Come on... it's a dead end. It may be great for you, running three years behind the times, but there are different use cases requiring different performance and upgrade paths. FOR ME, it's my work AND leisure rig, so I NEED the performance for work, and when the performance starts holding my work back I NEED to be able to upgrade cost-effectively. As my current rig isn't holding me back, the sensible decision FOR ME is to wait for Zen 4.

 

Quote

Piecemeal is a way better way to go. I bought a new CPU + motherboard + RAM this year: a 2700X, a B450, 32GB. I'm still running an RX 580.
The only reason I'm considering a new CPU is that I got a cheap B550 board, and if the 3900X goes on fire sale, why not.

It took you three years to get performance similar to what I've had for the last three years, which demonstrates the flaw in your advice FOR ME. For people running on a budget, different decisions will be involved than for other use cases. For you, Zen 3 will be a viable upgrade, as you already have the mobo/RAM and can wait for Zen 3 prices to drop once the next generation is released. FOR ME, I'm going to wait.

[attached: UserBenchmark comparison screenshot]


3 minutes ago, Keith_MM said:

-full post snipped; quoted in its entirety just above-

 

Why the hell are you using that BS site UserBenchmark?

 

I was around during the DDR4/DDR3 swap. It was DDR3L or DDR4, and it relied on Intel putting DDR3L controllers in Skylake for mobile.

It isn't likely to happen again.

 

I went from a 3rd-gen mobile quad-core i7 to a Skylake i5 6600K (worst chip I ever bought; a 6700K or a 4790K/5 would have been way better) to a 2700X.

I've got a spare mobo, yes, but I don't have enough spare DDR4.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


6 hours ago, GDRRiley said:

-full post snipped; quoted in its entirety just above-

It was the first comparison I googled. Here's the next one: https://www.techadvisor.co.uk/feature/pc-components/intel-core-i7-8700k-vs-amd-ryzen-2700x-3679686/ . It says pretty much the same thing: they have pretty similar performance, and the 2700X wasn't released until six months after I bought my 8700K.

 

Googling shows AMD didn't have dual-memory mobos, which makes putting cash towards a DDR5, Zen 4 system in 12-18 months the sensible option for my particular case. Others may think differently, taking into account their own needs, current mobo/chip, and budget.


1 hour ago, Keith_MM said:

It was the first comparison I googled. Here's the next one: https://www.techadvisor.co.uk/feature/pc-components/intel-core-i7-8700k-vs-amd-ryzen-2700x-3679686/ . It says pretty much the same thing: they have pretty similar performance, and the 2700X wasn't released until six months after I bought my 8700K.

Don't trust any review site beyond the basics, and be skeptical even then. The simple reason is this:

[attached: UserBenchmark screenshot comparing the 2700X and the 8700K]

There is no 6-core option, and most of the time the 8-core options are either inaccurate or nonexistent on UBM. The image shows the 2700X vs the 8700K; one has 8 cores, the other 6, so why isn't the 2700X beating it by at least 20%?
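
For reference, here's the naive core-count scaling expectation behind that 20% figure (a sketch only; it deliberately ignores IPC, clocks, and how well a given benchmark threads, which is exactly what single-number scores hide):

# Naive expectation if performance scaled purely with core count.
cores_2700x, cores_8700k = 8, 6
advantage = cores_2700x / cores_8700k - 1
print(f"Raw core advantage: {advantage:.0%}")  # ~33%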

 

A more complex reason one shouldn't trust places like UBM, or even TA, is multitasking. Yes, Intel wins at Fortnite and CS:GO, among others, when paired with the right GPU; however, two fewer cores and four fewer threads mean you can do fewer things at once, or more specifically, leave less running in the background. For anyone who wants to game while listening to YouTube, or simply not worry about closing everything before gaming, this matters, and tests don't show it. Tests only show what happens when you do one thing or the other; they don't reflect real-world use. They're great baselines (sometimes), but you have to look at what you actually do. Close everything to game or work? Then fine. Do any sort of multitasking? Ignore most reviews. That includes using Discord while gaming.

 

Cores mean more now than they did 5+ years ago. Today you can utilize all 8 cores, whereas an 8-core CPU like the old FX series was more show than substance, since almost nothing knew what to do with the extra 4 cores. So, at day's end, if you do any sort of multitasking, imo the 2700X is the clear winner. It's one reason I went AMD with the 3800X: it was cheaper (weirdly, even cheaper than the 3600 I was originally looking at) and offered more cores/threads than the Intel equivalent, something I knew I could make full use of.

 

That is also why Intel is having a meltdown right now: AMD is able to actually pull these things off without much issue.


 

Now, as for the fact that you bought the 8700K before the 2700X was released: why even compare them, other than to justify your purchase? That would be like me comparing my 1070 with the 1070 Ti because I bought it 6-8 months before the 1070 Ti was released; you made a choice before a better option came out. That happens. Personally, I would still have gone with a 1700X over the 8600K, but that's me.

9 hours ago, GDRRiley said:

to a Skylake i5 6600K (worst chip I ever bought; a 6700K or a 4790K/5 would have been way better)

Won't lie, the 6600K was a disappointment for me as well, but I knew it wasn't going to be the best of the best and offered only a mild upgrade over my 3570K. I just wanted to dump the 3570K and couldn't justify anything higher at the time.


4 hours ago, Egg-Roll said:

Won't lie, the 6600K was a disappointment for me as well, but I knew it wasn't going to be the best of the best and offered only a mild upgrade over my 3570K. I just wanted to dump the 3570K and couldn't justify anything higher at the time.

I think I had a below-average chip paired with a board that couldn't deliver enough power even for a weak quad-core. I struggled to hold 4.1GHz at 1.325V.

The other issue is that I play too many simulators and other CPU-heavy games; the extra threads would have helped. Hell, I wish I could have gone X99, but m-ATX is dead on HEDT and I didn't have the budget.



10 hours ago, Egg-Roll said:

 

Now, as for the fact that you bought the 8700K before the 2700X was released: why even compare them, other than to justify your purchase? That would be like me comparing my 1070 with the 1070 Ti because I bought it 6-8 months before the 1070 Ti was released; you made a choice before a better option came out. That happens. Personally, I would still have gone with a 1700X over the 8600K, but that's me.

 

Simply because GDRRiley was saying that buying piecemeal is the way to go, giving the example that he had upgraded to a 2700X this year. I was pointing out that he waited three years to get performance similar to what I have been using for the last three years, and that for my work it is sufficient to last at least another 12-18 months.

 

The performance of the current AMD chips is nice, but as an investment in technology to carry me through the next 4-5 years they are, in my opinion, a poor one FOR ME. I personally am going to wait for AM5 and Zen 4 and evaluate the next generation of GPUs before investing in a new rig with an upgrade path and the performance to last at least 4-5 years.


8 hours ago, GDRRiley said:

I think I had a below-average chip paired with a board that couldn't deliver enough power even for a weak quad-core. I struggled to hold 4.1GHz at 1.325V.

The other issue is that I play too many simulators and other CPU-heavy games; the extra threads would have helped. Hell, I wish I could have gone X99, but m-ATX is dead on HEDT and I didn't have the budget.

Could it be the PSU? Because, assuming it's the same board, I'm also on the Z170 chipset and mine is comfy at 4.10GHz 24/7, though ASUS might have cheaped out on that particular board, so I won't rule it out. If it's neither, then you just got unlucky with a shitty chip, which sucks.

 

Yeah, simulators are a pain in the ass to run on the 6600K. I think that's the main reason Satisfactory loves crashing so much: the chip can't deal with it and a browser open at the same time. With other games I also notice performance loss with other stuff open, sometimes significant. Kind of happy with the upgrade I made, like I'm sure you are with the 2700X.

 

1 hour ago, Keith_MM said:

Simply because GDRRiley was saying that buying piecemeal is the way to go, giving the example that he had upgraded to a 2700X this year. I was pointing out that he waited three years to get performance similar to what I have been using for the last three years, and that for my work it is sufficient to last at least another 12-18 months.

The performance of the current AMD chips is nice, but as an investment in technology to carry me through the next 4-5 years they are, in my opinion, a poor one FOR ME. I personally am going to wait for AM5 and Zen 4 and evaluate the next generation of GPUs before investing in a new rig with an upgrade path and the performance to last at least 4-5 years.

Everyone's financial situation is different, especially when building a new computer, so depending on when they built theirs, they might not have been able to justify the 3000 series, let alone any Intel chip. The 2700X dipped to $160 USD on Amazon on Boxing Day last year, whereas your chip was floating about $100 higher at the time. The 3600 (non-X) was floating around $175; the thing is, the 2700X has 2 more cores and 4 more threads, so in theory it would actually outperform the 3600 if you can use those cores. Assuming that's what happened, buying the 2700X was the smarter choice over something with fewer cores, even though it's technically older. So for a computer that, like mine by the looks of it, will last 3 years rather than 4-5 (though 4 isn't out of the question), it's a really good investment. Zen 4 should come out in year 2 of owning said computer, and if the new processors need DDR5, that gives the market another 1-2 years to produce enough chips to lower prices, as people may not want to pay the early-adopter tax that comes with them.

 

For me that is what I'm hoping for: in 2-3 years, when I want to upgrade my own 3800X (assuming the world doesn't go into a global recession), Zen 4 will be out and reasonably priced, or maybe Intel will regain their reign on the market and I'll hop back to them. The point is, while you can justify waiting for the new chips, many may not, and trust me, the 6600K is not worth using for another 2 years for anything beyond office work.

At the time I built my computer I knew AMD was making great strides, and I knew a computer built then would more than likely be obsolete in as little as 1-2 years. I still went out and built it anyways, because I was using something like GDRRiley's and it was slowly driving me insane. Me today would have gone the route they did, because the 2700X is still a great chip, esp at the price; the only reason I didn't back then was that I thought I could justify the extra cost, but my performance gains have not been worth it. Today, if someone was hell-bent on an Intel chip and never wanted to touch AMD, I would tell them to buy an 8th-gen chip like yours if it is significantly cheaper, because buying anything newer is a waste of money till Gen 11 comes out, and even then it might still be a complete waste.

 

You will be waiting for Zen 4 for 2 more years, maybe even 3; AMD is just cashing in on Intel's inability to compete right now. If you can justify waiting that long, then so be it. The 2700X and my 3800X are up to the challenge thanks to their 8 cores and 16 threads, at least for what I use mine for; maybe your 6 cores and 12 threads will work for you till then too, or maybe it will slowly wear you down into buying a 5000-series chip, or worse, the series after it.


2 hours ago, Egg-Roll said:

Could it be the PSU? Because assuming it is using the same board, I'm also using the Z170 chipset and mine is comfy at 4.10GHz 24/7, tho ASUS might have gotten cheap on that particular board, so I won't rule that out. If it's neither, then you just got unlucky with a shitty chip, which sucks.

 

Yeah, simulators are a pain in the ass to run on the 6600K. I think that's the main reason Satisfactory loves crashing so much: the chip can't deal with it and a browser open at the same time. With other games I also notice performance loss with other stuff open, sometimes significant. Kind of happy with the upgrade I made, like I'm sure you are with the 2700X.

An RM750X, no. It was a Z170M-Plus from ASUS.

I leave way more than a game and a browser open. I'm so lazy I'll often leave photo edits open and then start gaming.

At $250 for a board + CPU, yeah, it was a great deal.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x 12-disk RAID-Z3 striped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


GN posted their 6800 review 2 days ago lol, I missed it back then:

TL;DR: It's basically the poor man's 6800 XT/3080 (or close to it) when OC'd, or a 3070 equal when on sale, w/o taking RT into consideration. *Game dependent

1 hour ago, GDRRiley said:

An RM750X, no. It was a Z170M-Plus from ASUS.

I leave way more than a game and a browser open. I'm so lazy I'll often leave photo edits open and then start gaming.

At $250 for a board + CPU, yeah, it was a great deal.

OK, I'm guessing you took the 32GB over as well and didn't suffer with only 16GB, I hope anyways, as that would have been brutal lol. I try to close nothing myself as I hate loading times; half the time I even forget I have things still open 😳


2 hours ago, Egg-Roll said:

TL;DR: It's basically the poor man's 6800 XT/3080 (or close to it) when OC'd, or a 3070 equal when on sale, w/o taking RT into consideration. *Game dependent


Did you see the review? 

 

It trades blows with the 3080 toe to toe (actually it beats the 3080 in most cases, but only by a small amount), and when OC'd it sometimes beats even a 3090!

 

So it is rather a poor man's 3090.

 

But that's fine, since it doesn't compete with the 3090 directly; that will be the RX 6900 XT's job, which launches in December.

 

EDIT: sorry, the video you uploaded was about the 6800 non-XT (so just the plain RX 6800) and just had graphs for the 6800 XT... yeah, the 6800 non-XT could be considered a poor man's 3080...


1 hour ago, papajo said:

Did you see the review? 

 

It trades blows with the 3080 toe to toe (actually it beats the 3080 in most cases, but only by a small amount), and when OC'd it sometimes beats even a 3090!

 

So it is rather a poor man's 3090.

 

But that's fine, since it doesn't compete with the 3090 directly; that will be the RX 6900 XT's job, which launches in December.

I was referring to the 6800, not the 6800 XT (curse you AMD and your stupid naming schemes); the video I linked was the 6800 review ;) It got nowhere near the 3090, but it got close to or surpassed the 6800 XT and 3080 at stock speeds.

 

3070: for RT only

6800: the poor man's 6800 XT or 3080 (no RT, less a few titles), or a great non-RT card that outperforms the 3070, esp when on sale. Still yet to be seen if it can be flashed with XT firmware.

3080: for RT

6800 XT: a competitor for the 3080, or the poor man's 3090 less RT, in the same way the 6800 stands to the 3080/6800 XT.

3090

6900 XT: TBD


1 hour ago, papajo said:

actually it beats a 3080 in most cases but only by a small amount

Only at 1080p/1440p. At those resolutions it does have a slight edge, but at 4K the 3080 does seem to have the advantage, based on what I’ve seen.

 

Honestly, in pure raster the two are so closely matched it could be game optimizations making the difference. Idk tho

Current System: Ryzen 7 3700X, Noctua NH L12 Ghost S1 Edition, 32GB DDR4 @ 3200MHz, MAG B550i Gaming Edge, 1TB WD SN550 NVME, SF750, RTX 3080 Founders Edition, Louqe Ghost S1


2 minutes ago, Hymenopus_Coronatus said:

Only at 1080p/1440p. At those resolutions it does have a slight edge, but at 4K the 3080 does seem to have the advantage, based on what I’ve seen.

 

Honestly, in pure raster the two are so closely matched it could be game optimizations making the difference. Idk tho

Hardware Unboxed had a more comprehensive review of the XT here:

 

 

