[Updated with final Review] Vega FE Benchmarking by PCPER

Why does every bad AMD benchmark bring out the "it is only due to drivers" people?

So the great Nvidia benches should get even better ratings due to "early drivers" as well? Why all the super defensive acting?

They did worse than expected. End of story.

1 minute ago, Rattenmann said:

Why does every bad AMD benchmark bring out the "it is only due to drivers" people?

So the great Nvidia benches should get even better ratings due to "early drivers" as well? Why all the super defensive acting?

They did worse than expected. End of story.

Well, mostly because people post baiting comments like you just did on a subject that's completely unclear at the moment, yet proclaim that cards which don't exist yet are a failure, when no one has them to test and the drivers don't publicly exist yet.

 

Get the point?

12 minutes ago, zMeul said:

Vega doesn't seem to have tile based raster

example

 

 

Not like you to be late to the party on information. :D
 

 


5820K 4.0GHz | NH D15S | 32 GB RAM | GTX 580 | ASUS PG348Q+MG278Q

 

5 minutes ago, Rattenmann said:

Why does every bad AMD benchmark bring out the "it is only due to drivers" people?

So the great Nvidia benches should get even better ratings due to "early drivers" as well? Why all the super defensive acting?

They did worse than expected. End of story.

Maybe because there are differences due to the drivers, and because AMD themselves have stated that this driver is not representative of RX Vega performance.

43 minutes ago, MoonSpot said:

I wouldn't hold my breath.

It's just such a clusterfuck right now that the results aren't entirely valid; we lack visibility because proper reviews aren't being published yet. For instance, we should expect around 10% better performance just by changing the fan curve so it isn't thermal throttling. Then there's the absence of tile-based rasterization, which is significant for gaming, as it's one of the main features that made Maxwell so good at gaming tasks. It's not there; it could be massive, but we don't know whether it's driver-related or whether AMD made an architectural change there. That's a lot of ifs.


If the Fallout 4 1440p retest result is the same as the old one, then I guess Vega has the potential to be on par with the 1080 Ti with proper drivers and cooling. If not, then meh..


| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |

2 minutes ago, xAcid9 said:

If the Fallout 4 1440p retest result is the same as the old one, then I guess Vega has the potential to be on par with the 1080 Ti with proper drivers and cooling. If not, then meh..

 

Who knows; at the moment it's closer to an overclocked Fury X in games, and hell, major performance features like tiled rasterization aren't even working on the Frontier Edition.

This is such a weird-ass card, and "launch".



 

7 hours ago, Hunter259 said:

It's a prosumer card with a gaming mode. It's a Titan for AMD. They have actual pro cards on the way. Quadros aren't "meant" for gaming but run just as well as their GTX counterparts. This is such a weak argument for this card. You might get 10-15% from drivers, but this is still such a disappointing card.

I'd take it even further. Either:

 

1) The card will be weak or

2) AMD failed or lied about releasing more mature drivers this time around.

 

Pick your poison AMD Enthusiasts.



5 hours ago, zMeul said:

you surely understand / know that the GPUs inside the Quadro cards are identical to those in GTX cards? the only diff is that Quadro cards have more VRAM and it's ECC

it baffles me when people say this is a workstation card and the RX will be different - it won't! that's not how it works, it's the same architecture ... the same

That's not quite true, but for this discussion it mostly is. There are Quadro cards that use the GP100 die (GV100 soon), but there are no GeForce cards that use GP100; those all use GP102 or smaller.

 

The GP100 die has a completely different CUDA core and floating-point architecture; the difference is painfully obvious in FP16 performance.

 

Quote

On the compute side, Pascal introduces a new type of FP32 CUDA core that supports a form of FP16 execution where two FP16 operations are run through the CUDA core at once (vec2). This core, which for clarity I’m going to call an FP16x2 core, allows the GPU to process 1 FP32 or 2 FP16 operations per clock cycle, essentially doubling FP16 performance relative to an identically configured Maxwell or Kepler GPU.

 

Quote

Now there are several special cases here due to the use of vec2 – packing together operations is not the same as having native FP16 CUDA cores – but in a nutshell NVIDIA can pack together FP16 operations as long as they’re the same operation, e.g. both FP16s are undergoing addition, multiplication, etc. Fused multiply-add (FMA/MADD) is also a supported operation here, which is important for how frequently it is used and is necessary to extract the maximum throughput out of the CUDA cores.

 

Quote

GeForce GTX 1080, on the other hand, is not faster at FP16. In fact it’s downright slow. For their consumer cards, NVIDIA has severely limited FP16 CUDA performance. GTX 1080’s FP16 instruction rate is 1/128th its FP32 instruction rate, or after you factor in vec2 packing, the resulting theoretical performance (in FLOPs) is 1/64th the FP32 rate, or about 138 GFLOPs.

 

Quote

As it turns out, when it comes to FP16 NVIDIA has made another significant divergence between the HPC-focused GP100, and the consumer-focused GP104. On GP100, these FP16x2 cores are used throughout the GPU as both the GPU’s primary FP32 core and primary FP16 core. However on GP104, NVIDIA has retained the old FP32 cores. The FP32 core count as we know it is for these pure FP32 cores. What isn’t seen in NVIDIA’s published core counts is that the company has built in the FP16x2 cores separately.

 

Quote
        FP16                       FP64
GP104   1:64                       1:32
GP100   2:1                        1:2
GM200   N/A (Promoted to FP32)     1:32
GK110   N/A (Promoted to FP32)     1:3
GK104   N/A (Promoted to FP32)     1:24

http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5

 

Also note that GP102 has the same limitations, i.e. the 1080 Ti and Titan X/Xp.

 

Why is this important, you may ask? The entire Vega architecture has native INT8, FP16, FP32 and 1:16 FP64 support, which is extremely important for researchers and other professional use cases; you're getting $6000+ levels of performance for roughly $1000. Yeah, none of this means a damn thing for people wanting to play games, but anyone in the know in the scientific and academic community will be following Vega with heightened interest.
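To put concrete numbers on the rates quoted above, here's a quick back-of-the-envelope check in Python. The core count and boost clock are the GTX 1080's public reference specs (not from this thread), so treat this as a sanity check of the quoted "about 138 GFLOPs", not an official figure:

```python
# Sanity check of the FP16 rates quoted from AnandTech.
# Assumed reference specs: GTX 1080 (GP104), 2560 CUDA cores, ~1733 MHz boost.
cores = 2560
boost_clock_hz = 1733e6

# FP32: each core does 1 FMA (2 FLOPs) per clock.
fp32_gflops = cores * 2 * boost_clock_hz / 1e9  # ~8873 GFLOPs

# GP104 runs FP16 at 1/64 of FP32 rate
# (1/128 instruction rate x 2 ops per vec2 pack).
fp16_gflops = fp32_gflops / 64  # ~139 GFLOPs

# GP100, by contrast, runs FP16 at 2:1, i.e. double its FP32 rate.
print(f"GTX 1080 FP32: {fp32_gflops:.0f} GFLOPs, FP16: {fp16_gflops:.0f} GFLOPs")
```

The 1/64 ratio is why a $1000 consumer card is effectively useless for FP16-heavy workloads, and why native-rate FP16 on Vega matters to the research crowd.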

1 minute ago, Misanthrope said:

I'd take it even further. Either:

 

1) The card will be weak or

2) AMD failed or lied about releasing more mature drivers this time around.

 

Pick your poison AMD Enthusiasts.

The problem with AMD lying is that they did so directly to investors during earnings calls and shareholder meetings. That would get them into serious legal trouble.

Although, seeing major performance features like tiled rasterization not even working, it looks like major firmware and driver issues.

 

Still so many unknowns. We've not even seen a Frontier Edition teardown, or even a detailed architectural breakdown like with other GPU launches.



 

32 minutes ago, yian88 said:

No, people just want their answer faster: "how good is the new AMD Vega arch?" So they test a workhorse GPU like the FE in games, when it probably doesn't even have game drivers or optimizations.

This will create the impression that AMD Vega sucks, since every tech-illiterate person out there will see these benchmarks and, without informing themselves any further, think AMD sucks.

It creates bad rep for AMD, because many people are uninformed.

There's no way AMD would have released such an underperforming gaming card for $1000; no one would buy it when the 1080 Ti/Titan are miles ahead and cheaper.

I've watched the PCPer video; zero professional apps were tested. A bunch of losers, they should know better.

 

Of course no professional apps were tested; his viewers want to know how it runs in games.

So the answer is the latter: you are not happy that people are considering the FE as a gaming solution.

Also, you should check out what the third-person effect is, because I think you are suffering from it. Mass media (or in this case YouTubers) do not have the great effect on consumers you think they do. We are not ignorant consumers who will avoid AMD just because a gaming hardware reviewer didn't run professional app benchmarks. We want to know how well it runs games, just like we wanted to know how the Titan ran games. Nothing new here. I can't believe you are scared people won't look at the RX when it comes out because the FE didn't do well.


QuicK and DirtY. Read the CoC it's like a guide on how not to be moron.  Also I don't have an issue with the VS series.

Sometimes I miss contractions like n't on the end of words like wouldn't, couldn't and shouldn't.    Please don't be a dick,  make allowances when reading my posts.


Guys, I don't think "gaming" drivers will really help.... 

As you can see in the video the Quadro M6000 (which doesn't even have a gaming mode) performed just like a Titan X in games, which was using GeForce drivers. 

 

I believe that after driver optimizations, Vega may be able to match the 1080, but it will definitely not be able to match a 1080 Ti.


CPU: Intel Core i7-5820K | Motherboard: AsRock X99 Extreme4 | Graphics Card: Gigabyte GTX 1080 G1 Gaming | RAM: 16GB G.Skill Ripjaws4 2133MHz | Storage: 1 x Samsung 860 EVO 1TB | 1 x WD Green 2TB | 1 x WD Blue 500GB | PSU: Corsair RM750x | Case: Phanteks Enthoo Pro (White) | Cooling: Arctic Freezer i32

 

Mice: Logitech G Pro Wireless (main), Razer Viper Ultimate, Zowie S1 Divina Blue, Zowie FK1-B Divina Blue, Logitech G Pro (3366 sensor), Glorious Model O, Razer Viper Mini, Logitech G305, Logitech G502, Logitech G402

32 minutes ago, Taf the Ghost said:

Well, mostly because people post baiting comments like you just did on a subject that's completely unclear at the moment, yet proclaim that cards which don't exist yet are a failure, when no one has them to test and the drivers don't publicly exist yet.

 

Get the point?

Partially, but the fact that everyone's been waiting quite a while for AMD to release a beast plays a part as well. Nvidia has topped itself about four times at the high end, and in the same time AMD has only released seriously mid-level cards. People who've been patiently awaiting a high-end AMD card are pretty invested at this point, so many are getting defensive on that end as well.


- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -

6 minutes ago, PCGuy_5960 said:

Guys, I don't think "gaming" drivers will really help.... 

As you can see in the video, the Quadro M6000 (which doesn't even have a gaming mode) performed identically to the Titan X, which was using GeForce drivers.

 

I believe that after driver optimizations, Vega may be able to match the 1080, but it will definitely not be able to match a 1080 Ti.

The issues are more than drivers; tiled rasterization is not even working. That was part of the secret sauce for Maxwell over Kepler on the same 28nm node.

 

AMD's slides for investors and the public still show increased clocks and IPC as well. We're not seeing any of that, bar clocks.

Dunno what the hell is going on with the Frontier Edition; maybe it's a combination of drivers and firmware that's screwed the pooch, or the FE production stepping is a mess.

Videocardz mentioned a while back that the major delay was because of a major hardware issue during production, and they needed a new stepping to rectify it.

So maybe the FE was launched just to meet the investor deadline of H1 2017 for Vega :/

Either way, this entire launch is still perplexing; looking at the hardware specs alone, the performance makes absolutely no sense. It performs like an overclocked Fury X or 980 Ti, despite a node half the size, over 50% more TFLOPs, ~50% higher clocks, 100% more VRAM and a higher TDP, along with supposedly massive features like tiled rasterization, discard accelerators and more.
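For the curious, the "specs don't add up" point can be sketched with rough numbers. The shader counts and clocks below are the publicly listed specs for the Fury X and Vega FE (not figures from this thread), used purely as an illustration of theoretical throughput:

```python
# Rough spec comparison behind the "performance makes no sense" point.
# Assumed public specs: Fury X (4096 shaders @ ~1050 MHz, 28nm) vs
# Vega FE (4096 shaders @ ~1600 MHz boost, 14nm).
def tflops(shaders, clock_ghz):
    """Theoretical FP32 throughput: 2 FLOPs (one FMA) per shader per clock."""
    return shaders * 2 * clock_ghz / 1000

fury_x = tflops(4096, 1.05)   # ~8.6 TFLOPs
vega_fe = tflops(4096, 1.60)  # ~13.1 TFLOPs

print(f"Fury X: {fury_x:.1f} TFLOPs, Vega FE: {vega_fe:.1f} TFLOPs "
      f"(+{(vega_fe / fury_x - 1) * 100:.0f}%)")
```

On paper that's roughly a 50% uplift in raw compute, which is why FE benchmarking at Fury X levels suggests something other than raw hardware is the bottleneck.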



 

4 minutes ago, Valentyn said:

Dunno what the hell is going on with the Frontier Edition; maybe it's a combination of drivers and firmware that's screwed the pooch, or the FE production stepping is a mess.

Even if it gains 10-15% performance in games it will be on par with a 1080 FE.

 

So in the best case scenario Vega is on par with a 1080 in games, not bad, but only if they get the pricing right.




But WHY is tiled rasterization off?

 

Were they lying, or has something strange happened within the drivers?


On a mote of dust, suspended in a sunbeam

1 minute ago, PCGuy_5960 said:

Even if it gains 10-15% performance in games it will be on par with a 1080 FE.

 

So in the best case scenario Vega is on par with a 1080 in games, not bad, but only if they get the pricing right.

Aye, but as mentioned, even you can see it: the performance for those specs doesn't make sense.

Hell, the Frontier Edition is around 20fps slower than in the old Vega DOOM demo. What the hell is up with that?



 

4 hours ago, Red Hardware said:

Hi Guys

Something most of you are not paying attention to is that it's not a gaming card; it's made for workstation users. And I think there is something wrong with the drivers. After all that, Vega is only slightly better than a Fury X? I think it doesn't do well in games because of the drivers. We should wait for RX Vega and see how it goes, but it's too early to judge Vega based on some gaming benchmarks of a workstation card.

 

Name one person who said that very thing about any of the Titan X cards when they came out... They're both "workstation" cards, and the Titan X has (for all that matters) identical performance to the GTX 1080 Ti. Will RX Vega not have the exact same chip in it as the FE?

4 minutes ago, Valentyn said:

Aye, but as mentioned, even you can see it: the performance for those specs doesn't make sense.

Hell, the Frontier Edition is around 20fps slower than in the old Vega DOOM demo. What the hell is up with that?

In that DOOM demo, Vega was on par with a 1080 ;)

Don't get your hopes up; Vega won't be able to match the 1080 Ti unless it somehow magically gets a 50+% performance boost. :P I believe it will be as good as a 1080 in DX11 and slightly better than a 1080 when using DX12/Vulkan.



2 minutes ago, PCGuy_5960 said:

In that DOOM demo, Vega was on par with a 1080 ;)

Don't get your hopes up; Vega won't be able to match the 1080 Ti unless it somehow magically gets a 50+% performance boost. :P I believe it will be as good as a 1080 in DX11 and slightly better than a 1080 when using DX12/Vulkan.

On par with a 2GHz overclocked GTX 1080, while running on Fury drivers with a debugging layer, as confirmed by AMD.

Who said anything about matching a 1080 Ti? I want the performance the specs should deliver, and that's around 10% over a 1080, 10-15% below the Ti.

Hell, in Vulkan in DOOM, being on par with the 1080 is not good at all; AMD usually holds an advantage there.



 

Just now, Valentyn said:

On par with a 2GHz overclocked GTX 1080, while running on Fury drivers with a debugging layer, as confirmed by AMD.

Nope, on par with a stock 1080 when using Vulkan (and AMD cards get a 15-30% performance boost when using Vulkan).




What I don't understand is why AMD users wait THIS long for a damn gfx card. Just mow your grandma's lawn for a month or so and get a 1080.

I mean, what did they expect? Vega to come out far ahead of the 1080?

Time is gold, and wasting it waiting for "new stuff" is just silly imo.


Create an account or sign in to comment

You need to be a member in order to leave a comment

Create an account

Sign up for a new account in our community. It's easy!

Register a new account

Sign in

Already have an account? Sign in here.

Sign In Now

Newegg

×