
AMD Radeon VII Benchmark/Launch Mega Thread

Bcat00

So does this mean I can finally grab a 2080 Ti without killing my wallet?


1 minute ago, Bcat00 said:

So does this mean I can finally grab a 2080 Ti without killing my wallet?

Nope. Your wallet will still need to be sacrificed to Jensen's yearly bonus.


Just now, Taf the Ghost said:

Nope. Your wallet will still need to be sacrificed to Jensen's yearly bonus.

Sigh, I just knew it.

God, can someone just burn AMD and dismantle this crappy company already...


58 minutes ago, Taf the Ghost said:

I know everyone mentions it in the Review Space, but the market response to the RTX line is pretty clear: no one actually cares about it.

Better and more realistic lighting is something I love and want in games. But RTX doesn't "just work", and the price-to-performance for it is not worth it; and I'm hardly one who should even be talking about that, given my purchase history. :P

 

5950X | NH D15S | 64GB 3200MHz | RTX 3090 | ASUS PG348Q+MG278Q

 


Just now, Valentyn said:

Better and more realistic lighting is something I love and want in games. But RTX doesn't "just work", and the price-to-performance for it is not worth it; and I'm hardly one who should even be talking about that, given my purchase history. :P

 

Nvidia screwed up the marketing, to an extent. GDDR6 + the massive dies meant they had to up the prices to maintain margins, so they've gone full marketing with the RTX stuff because it's a way to try and convince people they aren't paying the same price for the same performance.

 

The real issue is that all of the cool new stuff with Turing, which will eventually kill off the Pascal cards, hasn't been their focus. Also, they really needed a better set of tech demos/games that can use it, and sooner.


3 minutes ago, Taf the Ghost said:

Nvidia screwed up the marketing, to an extent. GDDR6 + the massive dies meant they had to up the prices to maintain margins, so they've gone full marketing with the RTX stuff because it's a way to try and convince people they aren't paying the same price for the same performance.

 

The real issue is that all of the cool new stuff with Turing, which will eventually kill off the Pascal cards, hasn't been their focus. Also, they really needed a better set of tech demos/games that can use it, and sooner.

Kinda agree, but then you have the issue of leaks: whatever features a given company is working on aren't safe with all those developers.

 

A slower adoption process is good for everyone, so it doesn't render our stuff junk (somewhat useless) if a huge chunk of new games adopts it and blindsides us all.


3 minutes ago, pas008 said:

Kinda agree, but then you have the issue of leaks: whatever features a given company is working on aren't safe with all those developers.

 

A slower adoption process is good for everyone, so it doesn't render our stuff junk (somewhat useless) if a huge chunk of new games adopts it and blindsides us all.

The new stuff in Turing that isn't RTX features will benefit both Nvidia's Turing & all GCN-based AMD cards. That's a bit of the issue, as Nvidia will be helping AMD's FineWine by finally using async compute.
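For context on the async compute point: in D3D12 a game submits work through command queues, and creating a separate compute queue lets compute work overlap with graphics work on hardware that supports it, which is what GCN has long been good at. A minimal sketch, assuming a valid ID3D12Device* named device already exists (the helper name is illustrative, and error handling is omitted):

```cpp
// Minimal async-compute sketch for D3D12 (illustrative only; assumes a
// valid ID3D12Device* named `device` has already been created).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> CreateQueue(ID3D12Device* device,
                                       D3D12_COMMAND_LIST_TYPE type)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = type; // DIRECT = graphics queue, COMPUTE = compute-only queue
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Work submitted to `compute` may execute concurrently with work on
// `graphics`; that overlap is what "async compute" refers to here.
// auto graphics = CreateQueue(device, D3D12_COMMAND_LIST_TYPE_DIRECT);
// auto compute  = CreateQueue(device, D3D12_COMMAND_LIST_TYPE_COMPUTE);
```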


What a POS the Radeon VII is... Nvidia is probably like: "well yeah, fuck that... let's raise RTX 2080 and 2080 Ti prices by $70 this morning."

| CPU: Core i7-8700K @ 4.89GHz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080 Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066MHz |
| Displays: Acer Predator XB270HU 1440p G-Sync 144Hz IPS gaming monitor | Oculus Quest 2 VR |


5 minutes ago, Taf the Ghost said:

The new stuff in Turing that isn't RTX features will benefit both Nvidia's Turing & all GCN-based AMD cards. That's a bit of the issue, as Nvidia will be helping AMD's FineWine by finally using async compute.

And developers didn't want to incorporate it and leave out that massive Nvidia market share.

Now they can, and RT will be next once AMD gets on board with their solution.

It's kinda like DX9 to DX11: many devs stayed on DX9 just because they didn't want to lose out on potential customers whose cards couldn't run DX11, and some even made a DX9 version. The same thing is happening with DX11 to DX12 now. The old stuff just needs to get weeded out, and that takes time.

But you also kinda need to put the next stepping stones in place for the future, like RT, etc.


3 minutes ago, Bcat00 said:

Sigh, I just knew it.

God, can someone just burn AMD and dismantle this crappy company already...

You're right, let's burn the little competition left in the market because they're a few percent off, so that your Nvidia overlords can charge even more.

 

Spoiler

what you’ve just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.

Billy Madison

 


9 minutes ago, i_build_nanosuits said:

What a POS the Radeon VII is... Nvidia is probably like: "well yeah, fuck that... let's raise RTX 2080 and 2080 Ti prices by $70 this morning."

I don't think so. It is keeping pace with, or outpacing, the 2080 in some games, at the same price.

 

It is quite impressive that they can take the same Vega architecture to 7nm and get these results. Hopefully with the new Navi architecture we will see a big jump in performance.

 

 


Just now, maartendc said:

I don't think so. It is keeping pace with, or outpacing, the 2080 in some games, at the same price.

 

It is quite impressive that they can take the same Vega architecture to 7nm and get these results. Hopefully with the new Navi architecture we will see a big jump in performance.

 

How many reviews have you watched?

This card is $700... consumes more power than a 2080 Ti... in fact, more than any Nvidia card... doesn't have ray tracing, no Tensor cores, no DLSS... nowhere near the same level of support and optimization from game developers, etc. You'd have to be NUTS... and I mean it... completely NUTS to blow $700+ on this crap of a card instead of an RTX 2080.



1 minute ago, i_build_nanosuits said:

How many reviews have you watched?

This card is $700... consumes more power than a 2080 Ti... in fact, more than any Nvidia card... doesn't have ray tracing, no Tensor cores, no DLSS... nowhere near the same level of support and optimization from game developers, etc. You'd have to be NUTS... and I mean it... completely NUTS to blow $700+ on this crap of a card instead of an RTX 2080.

2080 Ti draws more power and no one cares about RTX.


Just now, Taf the Ghost said:

2080 Ti draws more power and no one cares about RTX.

Yeah... no... oh god... buy one then, quick!



IMHO AMD should have dropped HBM and redesigned Radeon VII for GDDR5X/6 to make it cheaper. 2+ year old performance for $700 or more is not something I would pay for. I mean, I already bought a 1080 Ti for €680 back in September last year. Hopefully Intel can develop better GPUs to tackle Nvidia, as AMD is not doing very well in the high-end consumer segment. Or AMD should lower it to $499 to make it much more attractive.

 

Nvidia is just cancer with their RTX and Tensor cores. Why make the RTX cards so damn expensive if they want to make ray tracing more common?

DAC/AMPs: Klipsch Heritage Headphone Amplifier

Headphones: Klipsch Heritage HP-3 Walnut, Meze 109 Pro, Beyerdynamic Amiron Home, Amiron Wireless Copper, Tygr 300R, DT880 600ohm Manufaktur, T90, Fidelio X2HR

CPU: Intel 4770, GPU: Asus RTX 3080 TUF Gaming OC, Mobo: MSI Z87-G45, RAM: 16GB G.Skill DDR3, PC Case: Fractal Design R4 Black (non-glass), Monitor: BenQ GW2280


10 minutes ago, i_build_nanosuits said:

doesn't have ray tracing, no Tensor cores, no DLSS...

I mean, that is like saying it doesn't have PhysX, or HairWorks, or Ansel, or whatever. Who cares? Just more Nvidia marketing-hype things that will not get supported by games and will die eventually.

 

The only game I currently play is Battlefield V, and the Radeon VII is 8% faster in that title at the same price (and Battlefield V is an Nvidia "sponsored" title, mind you!). So I would buy the Radeon VII if I were in the market for a $700 card. And no, nobody who plays Battlefield seriously would turn on RTX. If anything, people turn OFF as many effects as they can to improve visibility.


1 minute ago, CTR640 said:

IMHO AMD should have dropped HBM and redesigned Radeon VII for GDDR5X/6 to make it cheaper. 2+ year old performance for $700 or more is not something I would pay for. I mean, I already bought a 1080 Ti for €680 back in September last year. Hopefully Intel can develop better GPUs to tackle Nvidia, as AMD is not doing very well in the high-end consumer segment. Or AMD should lower it to $499 to make it much more attractive.

It's a repurposed compute card that's going to live a long life in science departments the world over and on the shelves of tech reviewers. We'll probably get an actual high-end card next year.


10 minutes ago, i_build_nanosuits said:

How many reviews have you watched?

This card is $700... consumes more power than a 2080 Ti... in fact, more than any Nvidia card... doesn't have ray tracing, no Tensor cores, no DLSS... nowhere near the same level of support and optimization from game developers, etc. You'd have to be NUTS... and I mean it... completely NUTS to blow $700+ on this crap of a card instead of an RTX 2080.

Most 2080s cost more than $700. Some models go as high as $900.


Just now, maartendc said:

I mean, that is like saying it doesn't have PhysX, or HairWorks, or whatever. Who cares? Just more Nvidia marketing-hype things that will not get supported by games and will die eventually.

 

Stopped reading there because you obviously have no idea what you're talking about ;)

Have a good day!



10 minutes ago, i_build_nanosuits said:

How many reviews have you watched?

This card is $700... consumes more power than a 2080 Ti... in fact, more than any Nvidia card... doesn't have ray tracing, no Tensor cores, no DLSS... nowhere near the same level of support and optimization from game developers, etc. You'd have to be NUTS... and I mean it... completely NUTS to blow $700+ on this crap of a card instead of an RTX 2080.

Not to mention heat/noise.

8 minutes ago, Taf the Ghost said:

2080 Ti draws more power and no one cares about RTX.

For now. I do care, because I want to go forward, not stagnate.


At this rate I'll be putting my money towards a 2080 instead. Give me a few more months of saving and that's what I'll be getting.

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte B550 Gaming X - RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15



2 minutes ago, CTR640 said:

IMHO AMD should have dropped HBM and redesigned GDDR5X/6 for Radeon VII to make it cheaper. 2+ years old performance for $700 or more is not something I would pay for it. I mean, I already bought 1080Ti for €680 back in september last year. Hopefully Intel can develop better GPU's to tackle nVidia as AMD is not doing very well in the high-end segment for consumers. Or AMD should lower it to $499 to make it much more attractive.

 

nVidia is just cancer with their RTX and Tensor cores. Why make the RTX cards damn expensive if they want to make ray tracing more common?

Hey, look at it this way: your 1080 Ti was the best investment ever, because it would still be great value even today. (Kind of sad, but good for you!)

 

Heck, I feel the same way about my 980 Ti, which I bought used for $350 about 2 years ago. At that price, it is still only 12% slower than an RTX 2060 (which is $349). It still does everything I need it to do at 1440p/60.
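For anyone who wants to sanity-check value claims like this, here is a toy perf-per-dollar calculation using only the figures quoted in the post above; the 0.88 relative-performance number is just the poster's "12% slower" restated, not benchmark data:

```cpp
#include <cstdio>

// Toy perf-per-dollar comparison using the figures from the post above
// (a used 980 Ti at $350, assumed ~12% slower than a $349 RTX 2060).
int main()
{
    const double perf2060  = 1.00, price2060  = 349.0; // baseline card
    const double perf980Ti = 0.88, price980Ti = 350.0; // "12% slower"
    std::printf("RTX 2060 : %.3f relative perf per $100\n",
                100.0 * perf2060 / price2060);
    std::printf("980 Ti   : %.3f relative perf per $100\n",
                100.0 * perf980Ti / price980Ti);
    return 0;
}
```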


1 minute ago, maartendc said:

I mean, that is like saying it doesn't have PhysX, or HairWorks, or Ansel, or whatever. Who cares? Just more Nvidia marketing-hype things that will not get supported by games and will die eventually.

 

The only game I currently play is Battlefield V, and the Radeon VII is 8% faster in that title at the same price (and Battlefield V is an Nvidia "sponsored" title, mind you!). So I would buy the Radeon VII if I were in the market for a $700 card. And no, nobody who plays Battlefield seriously would turn on RTX. If anything, people turn OFF as many effects as they can to improve visibility.

No, ray tracing will not die off.

It's not proprietary.

You can't leap forward if the stepping stones aren't in place.
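On the "not proprietary" point: DirectX Raytracing is exposed through the vendor-neutral D3D12 API, so any GPU vendor can implement it. A minimal sketch of how an engine might probe for DXR support, assuming a valid ID3D12Device* (the function name is made up for illustration):

```cpp
#include <d3d12.h>

// Illustrative check for DXR support on any vendor's D3D12 device.
// Requires a Windows SDK new enough to define D3D12_FEATURE_D3D12_OPTIONS5.
bool SupportsRayTracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false; // older runtime: no raytracing support reported

    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```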


1 minute ago, pas008 said:

No, ray tracing will not die off.

It's not proprietary.

You can't leap forward if the stepping stones aren't in place.

Real-time ray tracing is definitely the future of gaming; there's not even a shadow of a doubt about it... ask any developer, they will tell you.



4 minutes ago, i_build_nanosuits said:

Stopped reading there because you obviously have no idea what you're talking about ;)

Have a good day!

Well, that is one way to "win" a debate.

 

Good for you, you won.

