
HardOCP Blind Test of Vega vs 1080 Ti

VanayadGaming
2 minutes ago, Misanthrope said:

It looks like their gamble with HBM2 is as premature as Fiji was with HBM, and the increased costs and supply difficulties keep raising prices on their high-end stuff. I hope it pays off for them in other areas, like really fancy Ryzen + Vega APUs for example, but the tech is clearly not ready for prime time, and I suspect it's the main culprit behind the really high rumored price tags.

Nvidia's actions have pointed to an April release for a while. Vega FE made "Q2", but RX Vega is later. Four months could easily come down to HBM2 being the culprit, or pretty much what we all assume at this point.

 

I'm really SUPER CURIOUS about testing on the 7700K vs Ryzen 7. There are some really wonky Nvidia driver issues that could produce some hilarious results.

1 minute ago, mr moose said:

I never buy the top-end card, so if Vega performs like a 1080 but is actually priced closer to a card I can afford, then I am much more likely to consider it.

RX Vega 56 (?) is probably out in September, as that's AIB only. 


1 minute ago, Taf the Ghost said:

Well, Nvidia pretty much just rolled out the real version of the Titan X (first one) with the 1080 Ti. So, technically Nvidia was even dogging everyone on that.

Sorry, I was only saying Nvidia releases impressive products. I never meant to imply they aren't dicks, because they are. The Titan X (Pascal), 1080 Ti and Titan Xp scenario is disgusting.

 

2 minutes ago, Taf the Ghost said:

I saw someone mention something that could actually be the "party piece" here. We assume a linear-ish relationship between 1080p, 1440p & 4K gaming. What if some of the optimizations make 4K run at a much lower differential compared to 1080p? Considering the way the Fury X lined up against the 1080 at launch (it did much better comparatively at 4K than at 1080p), we could be in for something wacky.

Doubt it. We've seen it at 4K in Sniper Elite (an AMD-favouring game).


Interesting find on Reddit, but I don't know if it's true or not.

Quote

unit313: When AMD said there was a $300 price difference between the Vega & 1080 blind tests... I totally misunderstood.

 

ObviouslyTriggered: They've selected monitors which are $600 apart in their blind test ;)

 

ASUS Designo Curved MX34VQ, 100 Hz, FreeSync

ASUS PG348Q, 100 Hz, G-Sync

 

Nvidia monitor - ~$1,300 USD

AMD monitor - ~$700 USD

 

If this is the case, then when AMD says there is a $300 price difference, they are actually saying the RX Vega card is $300 more expensive than the 1080 Ti.
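
A quick back-of-the-envelope check of that reading, using the approximate monitor prices quoted above (purely illustrative; these are the Reddit figures, not confirmed MSRPs):

```python
# Rough arithmetic behind the "$300 more expensive" interpretation.
# Prices are the approximate figures quoted above, not confirmed MSRPs.
freesync_monitor = 700    # ASUS MX34VQ, ~USD
gsync_monitor = 1300      # ASUS PG348Q, ~USD
system_gap = 300          # AMD's claim: the Nvidia system costs ~$300 more overall

monitor_gap = gsync_monitor - freesync_monitor   # 600
gpu_gap = monitor_gap - system_gap               # 300
print(f"Implied GPU gap: RX Vega costs ~${gpu_gap} more than the 1080 Ti")
```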

 

Again, I refuse to believe that AMD would be this stupid.


37 minutes ago, Misanthrope said:

Hey if AMD is truly going with "Buy Vega because you can still save 300 bucks on your G-Sync monitor" then RX Vega will be fucking stillborn.

 

It's a decent argument, don't get me wrong, but most people are not gonna care: if you can afford $700 for a GPU, you are almost always the kind of person who can also afford a higher-end monitor to go with it, and those are mostly similarly priced; the G-Sync cost gets eaten up fairly quickly on higher-end panels.

Meanwhile I'm sitting here with a watercooled 1080 Ti that hits 2.0 GHz with amazing thermals and no problems, using three 1080p monitors locked at 60 fps lol

 

(Waiting for monitor prices to come down. Also waiting to see 4K, 2K, and 3440x1440 first-hand before buying any of them.)

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


@tom_w141

 

Not saying they will, but AMD aren't stupid, even if RTG's marketing can be wonky as all heck at times. Nvidia cut the 1080 price to a point that made sure AMD couldn't retail at $599 and just undercut them.

 

Still, they're going to have some angle where they're better than Nvidia. I just don't know what it is.


1 minute ago, Taf the Ghost said:

There are some really wonky Nvidia driver issues that could produce some hilarious results.

Tell me about it. When I see a game has a DX12 mode I'm just like "no" and select DX11 mode, because Nvidia can't make a DX12 driver that doesn't halve my fps. Seriously, it's a joke that using the newer API cripples performance. I love my 1080 Ti, but Nvidia needs to sort their shit out when it comes to drivers.


2 minutes ago, LAwLz said:

Again, I refuse to believe that AMD would be this stupid.

 

Well, they are doing a blind test with 10 people on a look-only tour before launch.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


4 minutes ago, Taf the Ghost said:

@tom_w141

 

Not saying they will, but AMD aren't stupid, even if RTG's marketing can be wonky as all heck at times. Nvidia cut the 1080 price to a point that made sure AMD couldn't retail at $599 and just undercut them.

 

Still, they're going to have some angle where they're better than Nvidia. I just don't know what it is.

Think of their stupidity in this area as the counterbalance to their currently very successful CPU products.

-------

Current Rig

-------


7 minutes ago, tom_w141 said:

Tell me about it. When I see a game has a DX12 mode I'm just like "no" and select DX11 mode, because Nvidia can't make a DX12 driver that doesn't halve my fps. Seriously, it's a joke that using the newer API cripples performance. I love my 1080 Ti, but Nvidia needs to sort their shit out when it comes to drivers.

The ring bus + inclusive L3 cache. It's why getting 7700K-level performance takes work on Skylake-X. Nvidia looked at the 2012-and-beyond CPU market and made their driver exploit Sandy Bridge to its fullest. Good on their driver developers, but they clearly have problems going beyond that.


1 minute ago, Misanthrope said:

Think of their stupidity in this area as the counterbalance to their currently very successful CPU products.

Maybe when they cordoned off RTG, part of it was that they sent the stupid marketing people along with it.

 

Still, I'm going to predict $599 USD for the air-cooled card and $749 USD for the water-cooled one. Unless Rapid Packed Math does something insane for 4K gaming, that's about the price we'd be expecting.

 

For the RX Vega 56 versions, I'm thinking $450 is the target.


27 minutes ago, Taf the Ghost said:

The ring bus + inclusive L3 cache. It's why getting 7700K-level performance takes work on Skylake-X. Nvidia looked at the 2012-and-beyond CPU market and made their driver exploit Sandy Bridge to its fullest. Good on their driver developers, but they clearly have problems going beyond that.

I doubt that's the reason for poor DX12 performance.

I mean, AMD is not exactly doing that well in DX12 either (in before "you can't count title X, Y, Z, A and B because I don't deem them 'true' DX12 titles! Only this handful of cherry picked games counts!").

Since both AMD and Nvidia have issues (in varying degrees), I think it's pretty fair to say that a big part of the problem is just down to the maturity of DX12 as a whole. Tools and developers will probably get a lot better in the coming years which might fix a lot of issues without AMD and Nvidia needing to do anything on their ends (which they probably will anyway).


4 minutes ago, LAwLz said:

I doubt that's the reason for poor DX12 performance.

I mean, AMD is not exactly doing that well in DX12 either (in before "you can't count title X, Y, Z, A and B because I don't deem them 'true' DX12 titles! Only this handful of cherry picked games counts!").

Since both AMD and Nvidia have issues (in varying degrees), I think it's pretty fair to say that a big part of the problem is just down to the maturity of DX12 as a whole. Tools and developers will probably get a lot better in the coming years which might fix a lot of issues without AMD and Nvidia needing to do anything on their ends (which they probably will anyway).

I was actually going to say "id makes everyone in the Gaming space look stupid", but haha.

 

Actually, what we're really seeing is a classic issue with computers: theoretical output is rarely ever met. It takes specific optimization of code and systems to get near 100% out of equipment. It's the reason you see actual regressions in performance with new versions of software or new approaches. Some old programs outright run better on slower OSes or hardware because they're specially tuned for those situations.

 

The other thing is that DX12 tends to favor AMD a bit more than Nvidia, because some of its features get a lot more of the theoretical potential out of AMD's uArch. The same team that produced Nvidia's impressive DX11 drivers didn't suddenly become incompetent once DX12 landed; clearly some combination of the uArch and DX12's requirements causes problems. (Though id can still get a massive amount out of it, haha.)


3 hours ago, tom_w141 said:

Blind testing is bad news. When a company blind tests instead of revealing impressive numbers, it means the product is inferior. They are basically saying it's not as good, but hey, it feels the same! This is because at higher frame rates, for example, we can't notice the difference between 120 and 150 fps.

 

Also, the fact that they keep stressing that Vega + FreeSync is cheaper than Nvidia + G-Sync says to me that they know they aren't competitive on GPU price and are relying on the large premium carried by G-Sync panels. Well, that's crap logic, because for a start someone upgrading their GPU might already have a decent FreeSync monitor, and therefore it would be better for them to get a 1080 Ti and just not use the FreeSync functionality.

120 and 150? It's pretty easy to tell the difference between 120 and 144.


This seems like an attempt to invalidate faster hardware by locking it to a specific refresh window (variable refresh window) then invalidate the framerate fluctuation by using variable refresh to smooth it all out. Freesync and G-Sync are basically designed to mask inconsistent framerates, and it makes it difficult to get an accurate representation of which product is actually superior. Sure, the experience might be similar under these conditions (and to most, this will matter) but it doesn't really give us an accurate depiction of what these two pieces of hardware are capable of, relative to each other.

 

That's all before considering that blind studies tend to require large sample sizes for accuracy, and 10 is not exactly large.
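
To put that sample size in perspective, here is a rough sketch (my own illustrative numbers, not anything from AMD's test) of how little 10 testers can tell you: even if 8 of the 10 preferred one system, a simple exact binomial test against a 50/50 coin flip isn't conclusive at the usual 5% level.

```python
from math import comb

def two_sided_binomial_p(successes: int, n: int) -> float:
    """Exact two-sided binomial test against p = 0.5 (doubled upper tail)."""
    tail = sum(comb(n, k) for k in range(successes, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Illustrative only: suppose 8 of the 10 blind testers preferred one system.
p = two_sided_binomial_p(8, 10)
print(f"p-value ~ {p:.3f}")  # ~0.109, i.e. not significant at the 0.05 level
```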

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


5 hours ago, tom_w141 said:

Blind testing is bad news. When a company blind tests instead of revealing impressive numbers, it means the product is inferior. They are basically saying it's not as good, but hey, it feels the same! This is because at higher frame rates, for example, we can't notice the difference between 120 and 150 fps.

 

Also, the fact that they keep stressing that Vega + FreeSync is cheaper than Nvidia + G-Sync says to me that they know they aren't competitive on GPU price and are relying on the large premium carried by G-Sync panels. Well, that's crap logic, because for a start someone upgrading their GPU might already have a decent FreeSync monitor, and therefore it would be better for them to get a 1080 Ti and just not use the FreeSync functionality.

I completely get what you are saying. I personally doubt the high-end Vega will have a higher max fps than the 1080 Ti. However, we're seeing benchmarkers move away from min/avg/max fps over to average, lowest 1%, and lowest 0.1% fps for a reason. High max and average fps are less important if you have stutter, huge dips and such. In that regard, I think the new memory controller in Vega is going to be quite revolutionary. Because the card can pull resources from a lot of different sources, we might see it get all the resources it needs without using the CPU as a middleman. This could result in massively higher lowest 1% and 0.1% performance than other cards (not just the controller, but HBM2 as well).

 

Of course, that is speculation, but reading about the controller and seeing the type of testing AMD is doing, it would make sense.
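
On the 1% / 0.1% lows point, for anyone unfamiliar: a minimal sketch of how those figures are commonly derived from a frame-time log (conventions vary between reviewers; this is just one common approach, with made-up sample data):

```python
import numpy as np

def low_metrics(frame_times_ms: np.ndarray) -> dict:
    """Average fps plus '1% low' / '0.1% low' fps from per-frame times (ms).

    One common convention: take the slowest 1% (or 0.1%) of frames and
    report the average fps over just those frames.
    """
    worst = np.sort(frame_times_ms)[::-1]            # slowest frames first
    n1 = max(1, len(worst) // 100)
    n01 = max(1, len(worst) // 1000)
    return {
        "avg_fps": (1000.0 / frame_times_ms).mean(),
        "1%_low_fps": (1000.0 / worst[:n1]).mean(),
        "0.1%_low_fps": (1000.0 / worst[:n01]).mean(),
    }

# Illustrative run: mostly smooth ~8.3 ms frames with a few 40 ms stutters.
times = np.array([8.3] * 2000 + [40.0] * 5)
print(low_metrics(times))
```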

4 hours ago, LAwLz said:

I doubt that's the reason for poor DX12 performance.

I mean, AMD is not exactly doing that well in DX12 either (in before "you can't count title X, Y, Z, A and B because I don't deem them 'true' DX12 titles! Only this handful of cherry picked games counts!").

Since both AMD and Nvidia have issues (in varying degrees), I think it's pretty fair to say that a big part of the problem is just down to the maturity of DX12 as a whole. Tools and developers will probably get a lot better in the coming years which might fix a lot of issues without AMD and Nvidia needing to do anything on their ends (which they probably will anyway).

Yeah, DX12 is very hit-and-miss. It seems to favour one vendor and punish the other, and it seems to correlate well with which vendor helped the dev with their DX12 implementation. In the long run, that is really not a good thing. Then again, you should just use whichever API yields the best result on your card regardless.

 

And then you have DOOM. I still don't get why the Dishonored 2 devs used the Rage engine and ported it to DX11, rather than using the Doom engine with Vulkan. Dishonored 2 is so damn good, but it suffers a lot from the shit engine.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Quote

One system costs $300 extra

Given what we know about Vega pricing, I think we might have taken that the wrong way :/

AMD Ryzen R7 1700 (3.8ghz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SDD + 7TB HDD, Phanteks enthoo pro case


These "I can't see the difference in this particular scenario"  tests make no sense.

I'm afraid AMD knows Vega won't compete with the 1080Ti and I don't understand why the disabled features in Vega FE aren't helping RX versions.

On a mote of dust, suspended in a sunbeam


4 hours ago, GatioH said:

120 and 150? It's pretty easy to tell the difference between 120 and 144.

No, it isn't. There are specific scenarios where you could, but if you're staring at an FPS counter and can "feel" a difference, it is just your brain tricking you into thinking you can. The higher your FPS, the bigger the jump you need to tell any real difference. 30 to 60 is a huge jump. 60 to 90 is much smaller even though it is the same increase, but going from 60 up to 120 produces another leap. Going from 120 to 180 is an even smaller jump than 60 to 90, and 120 to 240 is smaller than 60 to 120. At some point the monitor is refreshing so fast that our brains simply cannot keep up with it, and we lose the ability to truly feel the differences.
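
The frame-time arithmetic backs this up; a quick illustration (simple arithmetic, nothing vendor-specific):

```python
# Perceived smoothness tracks frame time, not fps, so equal-looking fps jumps
# shrink rapidly in frame-time terms. Purely illustrative arithmetic.
pairs = [(30, 60), (60, 90), (60, 120), (120, 150), (120, 180), (120, 240)]
for low, high in pairs:
    delta_ms = 1000 / low - 1000 / high
    print(f"{low:>3} -> {high:>3} fps: frame time drops by {delta_ms:5.2f} ms")
# 30 -> 60 saves ~16.7 ms per frame; 120 -> 150 saves only ~1.7 ms.
```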


3 hours ago, Notional said:

And then you have DOOM. I still don't get why the Dishonored 2 devs used the Rage engine and ported it to DX11, rather than using the Doom engine with Vulkan. Dishonored 2 is so damn good, but it suffers a lot from the shit engine.

They had probably been working on it since Dishonored 1 came out.

By the time Doom's Vulkan renderer came out, I doubt they wanted to refactor everything. The timing wasn't right...


52 minutes ago, Derangel said:

No, it isn't. There are specific scenarios where you could, but if you're staring at an FPS counter and can "feel" a difference, it is just your brain tricking you into thinking you can. The higher your FPS, the bigger the jump you need to tell any real difference. 30 to 60 is a huge jump. 60 to 90 is much smaller even though it is the same increase, but going from 60 up to 120 produces another leap. Going from 120 to 180 is an even smaller jump than 60 to 90, and 120 to 240 is smaller than 60 to 120. At some point the monitor is refreshing so fast that our brains simply cannot keep up with it, and we lose the ability to truly feel the differences.

Yeah, it is. Find some text somewhere and stand a couple of feet away from it; as you're walking left and right, it's a night-and-day difference (even without an FPS counter).


1 hour ago, Agost said:

These "I can't see the difference in this particular scenario"  tests make no sense.

I'm afraid AMD knows Vega won't compete with the 1080Ti and I don't understand why the disabled features in Vega FE aren't helping RX versions.

I keep seeing this, but I never really hear what disabled features could possibly aid in Vega's (consumer) performance. What features were disabled, and what impact did you expect it to have?

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


1 minute ago, MageTank said:

I keep seeing this, but I never really hear what disabled features could possibly aid in Vega's (consumer) performance. What features were disabled, and what impact did you expect it to have?

We really have no idea what the difference is between Vega FE and RX Vega.

AMD refuses to tell us until RX Vega launches.

 

From Raja's AMA

RA2lover: What things does the RX Vega have over the Radeon Vega FE that would make it worth the extra wait?

Raja Koduri: RX will be fully optimized gaming drivers, as well as a few other goodies that I can’t tell you about just yet….


7 minutes ago, MageTank said:

I keep seeing this, but I never really hear what disabled features could possibly aid in Vega's (consumer) performance. What features were disabled, and what impact did you expect it to have?

I thought tile-based rasterization was one thing, but I could be wrong.

[Out-of-date] Want to learn how to make your own custom Windows 10 image?

 

Desktop: AMD R9 3900X | ASUS ROG Strix X570-F | Radeon RX 5700 XT | EVGA GTX 1080 SC | 32GB Trident Z Neo 3600MHz | 1TB 970 EVO | 256GB 840 EVO | 960GB Corsair Force LE | EVGA G2 850W | Phanteks P400S

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA

Server 01: Intel Xeon D 1541 | ASRock Rack D1541D4I-2L2T | 32GB Hynix ECC DDR4 | 4x8TB Western Digital HDDs | 32TB Raw 16TB Usable

Server 02: Intel i7 7700K | Gigabye Z170N Gaming5 | 16GB Trident Z 3200MHz


If the air-cooled card is over £450 it's dead on arrival... and they should not be charging an extra £150 for the water-cooled version when the cooler they put on it is a £50 cooler at most.

 

I want Vega to succeed, but going off the prices that have been bandied around for the rumored performance, it's got no chance.

----Ryzen R9 5900X----X570 Aorus elite----Vetroo V5----240GB Kingston HyperX 3k----Samsung 250GB EVO840----512GB Kingston Nvme----3TB Seagate----4TB Western Digital Green----8TB Seagate----32GB Patriot Viper 4 3200Mhz CL 16 ----Power Color Red dragon 5700XT----Fractal Design R4 Black Pearl ----Corsair RM850w----


 

 

This was my reaction. How bad is this fucking thing that they don't bother with benchmarks, but instead go straight to some bullshit infomercial tactic of blind testing? Where's the "just wait, there's more... call now and we'll throw in a free hybrid cooler!"?

 

Or are they just trolling us right now and it's somehow going to be good, and they just want us to think it's a steaming stillborn fetus piece of shit?

Workstation:  13700k @ 5.5Ghz || Gigabyte Z790 Ultra || MSI Gaming Trio 4090 Shunt || TeamGroup DDR5-7800 @ 7000 || Corsair AX1500i@240V || whole-house loop.

LANRig/GuestGamingBox: 9900nonK || Gigabyte Z390 Master || ASUS TUF 3090 650W shunt || Corsair SF600 || CPU+GPU watercooled 280 rad pull only || whole-house loop.

Server Router (Untangle): 13600k @ Stock || ASRock Z690 ITX || All 10Gbe || 2x8GB 3200 || PicoPSU 150W 24pin + AX1200i on CPU|| whole-house loop

Server Compute/Storage: 10850K @ 5.1Ghz || Gigabyte Z490 Ultra || EVGA FTW3 3090 1000W || LSI 9280i-24 port || 4TB Samsung 860 Evo, 5x10TB Seagate Enterprise Raid 6, 4x8TB Seagate Archive Backup ||  whole-house loop.

Laptop: HP Elitebook 840 G8 (Intel 1185G7) + 3080Ti Thunderbolt Dock, Razer Blade Stealth 13" 2017 (Intel 8550U)

