
Regarding the RX 480 AOTS benchmark at Computex

Fulgrim
2 minutes ago, Demonking said:

Because AMD makes their own CPUs, which compete with Intel's, using the competition's product instead of their own to test another of their own products shows they have no faith in said product, yet they expect people to buy it. How is this so hard to understand?

It's not a matter of faith. It's a matter of their FX chips being inferior to the current high-end segment of the desktop CPU market. Denying it is plain stupidity, and so is using "faith" as a reason for them to use FX chips in benchmarks.

Shot through the heart and you're to blame, 30fps and i'll pirate your game - Bon Jovi

Take me down to the console city where the games are blurry and the frames are thirty - Guns N' Roses

Arguing with religious people is like explaining to your mother that online games can't be paused...

Link to comment
Share on other sites


5 minutes ago, Demonking said:

Because AMD makes their own CPUs, which compete with Intel's, using the competition's product instead of their own to test another of their own products shows they have no faith in said product, yet they expect people to buy it. How is this so hard to understand?

We've all understood it and told you why it is so, but you just keep on digging. I'd say this is more embarrassing than AMD throwing Bulldozer under the bus. AMD is watching this exchange and feeling Fremdscham (secondhand embarrassment). I guarantee it.


1 hour ago, That Norwegian Guy said:

AotS is the only natively developed DX12 title. Hitman, QB, and TW: Warhammer added DX12 support mid-development and weren't developed from the ground up to use it.

What is relevant is the implementation, not the timing. The argument can be made against Hitman, where async compute does basically nothing, but not against Total War or Quantum Break. Quantum Break uses it for its lighting, and I don't need to tell you what it's used for in Warhammer. Maybe you want the info from the horse's mouth?

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


5 hours ago, DarkBlade2117 said:

So you can have a mobile oven in your backpack? Razer's idea of cooling is no better than Apple's idea of cooling.

 

You think I want to carry around a 17in, 7-9 lb laptop? Of course not; I'll take the heat for the portability.

My Rig:  CPU: Core i7 4790K @4.8GHz  Motherboard: Asus Maximus VII Hero  Ram: 4x4GB Corsair Vengeance Pro 2400MHz (Red)  Cooling: Corsair H105, 2x Corsair SP120 High Performance Editions, Corsair AF140 Quiet Edition  PSU: Corsair RM 850  GPU: EVGA GTX 980 SC ACX 2.0  Storage: Samsung 840 EVO 120GB, WD Blue 1TB  Case: Corsair 760T (Black)  Keyboard: Razer Blackwidow Chroma  Mouse: Razer Deathadder Chroma  Headset: ATH-M50X  Mic: Blue Yeti Blackout

 


Well, it was mainly all for the demo. Expect a ton of benchmarks soon anyway.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


5 hours ago, TidaLWaveZ said:

Actually...

 

The reference card is $450

Exactly my point.

We're talking about reference prices. We don't know what AIB partners will do for the 480; for all you know, there could be a $170 ultra-basic 480.

1 hour ago, Demonking said:

Because AMD makes their own CPUs, which compete with Intel's, using the competition's product instead of their own to test another of their own products shows they have no faith in said product, yet they expect people to buy it. How is this so hard to understand?

Actually, for benchmarking purposes, it's probably more reliable to use a competitor's CPU instead of your own. Nobody can create conspiracy theories about AMD pulling voodoo magic with a CPU running at 8 GHz or some shit.


1 hour ago, Demonking said:

I'm going to leave it at this before an admin gets on us. If I were the engineer behind FX, I would be embarrassed by this. If I were AMD, I would still use FX to this day to show I had faith in my product from 4 years ago, showing it can still compete.

Come on....

If you are showing off your new GPU, then it makes sense to use the best available CPU so that you can show the full potential of your new product.

And you want to make damn sure that your new product appeals to people as well as it possibly can.

 

What good would it do to hinder the performance of your new GPU just because you are so proud of your 5-year-old CPU that is soon to be discontinued?

 

I am sure AMD would love to use their own CPU for these tests, but Zen is obviously not ready yet.


1 hour ago, Demonking said:

I did; if you look above, you would know that. This doesn't change the fact that it is embarrassing.

I think they'd rather be embarrassed than be seen as stubborn.

Pixelbook Go i5 Pixel 4 XL


1 hour ago, ANewFace said:

You think I want to carry around a 17in, 7-9 lb laptop? Of course not; I'll take the heat for the portability.

You know Dell, HP, Gigabyte, and MSI all have notebooks that weigh less than 5 pounds. Why get a 1070m or whatnot when it'll thermal throttle in the Blade? I'd rather not have my laptop sit at 90°C+.

The Blade is not a good laptop, especially for the price. Someone could suggest a better one at a much lower price; I think it was HP specifically who had one.

CPU: i7-6700k  Cooling: Deepcool Captain 240EX White  GPU: GTX 1080Ti EVGA FTW3  Mobo: AsRock Z170 Extreme4  Case: Phanteks P400s TG Special Black/White  PSU: EVGA 850w GQ  Ram: 64GB (3200MHz 16x4 Corsair Vengeance RGB)  Storage: 1TB Seagate Barracuda, 240GB SanDisk SSD Plus, 480GB OCZ Trion 150, 1TB Crucial NVMe
(Rest of Specs on Profile)


1 hour ago, RagnarokDel said:

Exactly my point.

We're talking about reference prices. We don't know what AIB partners will do for the 480; for all you know, there could be a $170 ultra-basic 480.

Actually, for benchmarking purposes, it's probably more reliable to use a competitor's CPU instead of your own. Nobody can create conspiracy theories about AMD pulling voodoo magic with a CPU running at 8 GHz or some shit.

We're really talking about graphics card solutions, and there's a part of the AMD press conference where they announce that the RX 480 will start at $199. You could be right about third parties releasing a cheaper version, but I interpret what they said as the baseline being $199.

- ASUS X99 Deluxe - i7 5820k - Nvidia GTX 1080ti SLi - 4x4GB EVGA SSC 2800mhz DDR4 - Samsung SM951 500 - 2x Samsung 850 EVO 512 -

- EK Supremacy EVO CPU Block - EK FC 1080 GPU Blocks - EK XRES 100 DDC - EK Coolstream XE 360 - EK Coolstream XE 240 -


I'm just gonna say this...

 

AotS was the worst mistake they could make. The market currently perceives it as an AMD-biased benchmark, especially after the 390X vs Titan X fiasco.

 

They keep pulling these dumb-arse benchmarks that just paint them in a bad light. Why not use a benchmark considered less biased? Something to hide?

Spartan 1.0

Spoiler

CPU: Intel Core i7-4770K 3.5GHz Quad-Core Processor

CPU Cooler: Cooler Master Seidon 120XL 86.2 CFM Liquid CPU Cooler

Motherboard: Asus Maximus VI Extreme ATX LGA1150 Motherboard
Memory: Corsair Dominator 32GB (4 x 8GB) DDR3-1600 Memory
Storage: OCZ Vector Series 512GB 2.5" Solid State Drive
Storage: Seagate Desktop HDD 4TB 3.5" 7200RPM Internal Hard Drive

Video Card: EVGA GeForce GTX 980 4GB Classified ACX 2.0 Video Card
Case: Thermaltake Urban S41 ATX Mid Tower Case
Power Supply: Corsair 1200W 80+ Platinum Certified Fully-Modular ATX Power Supply
Optical Drive: LG BH16NS40 Blu-Ray/DVD/CD Writer
Optical Drive: LG BH10LS30 Blu-Ray/DVD/CD Writer
Operating System: Microsoft Windows 10 Pro 64-bit
Sound Card: Creative Labs ZXR 24-bit 192 KHz Sound Card
Monitor: 2x Asus VG278HE 27.0" 144Hz Monitor
Keyboard: Logitech G19s Wired Gaming Keyboard
Keyboard: Razer Orbweaver Elite Mechanical Gaming Keypad Wired Gaming Keyboard
Mouse: Logitech G700s Wireless Laser Mouse
Headphones: Creative Labs EVO ZxR 7.1 Channel  Headset
Speakers: Creative Labs GigaWorks T40 Series II 32W 2ch Speakers

Hades 1.0

Spoiler

Laptop: Dell Alienware 15 2015

CPU: i7-4720HQ CPU

Memory: 16GB DDR3 SODIMM RAM

Storage: 256GB M.2 SSD

Storage: 1TB 5400rpm 2.5" HDD

Screen: 15.6" FHD Display

Video Card: Nvidia GTX 970M with 3GB

Operating System: Windows 10 Pro

Project: Spartan 1.2 PLEASE SUPPORT ME NEW CHANNEL > Tech Inquisition


1 hour ago, GidonsClaw said:

I'm just gonna say this...

 

AotS was the worst mistake they could make. The market currently perceives it as an AMD-biased benchmark, especially after the 390X vs Titan X fiasco.

 

They keep pulling these dumb-arse benchmarks that just paint them in a bad light. Why not use a benchmark considered less biased? Something to hide?

It's biased due to Nvidia's architecture. It's a great example of a good DX12 game showing gains on AMD hardware, gains which you will not get on Nvidia's hardware.

if you want to annoy me, then join my teamspeak server ts.benja.cc


2 hours ago, GidonsClaw said:

I'm just gonna say this...

 

AotS was the worst mistake they could make. The market currently perceives it as an AMD-biased benchmark, especially after the 390X vs Titan X fiasco.

 

They keep pulling these dumb-arse benchmarks that just paint them in a bad light. Why not use a benchmark considered less biased? Something to hide?

I agree with you that the results are biased, but that's because AMD pushes it so hard; it is by far the most well-developed DX12 game. Its engine can demo, and has demoed, pretty much all the benefits of DX12 and async compute. If another game offered that level of depth, I'm sure they would demo that too.

 

Nvidia is hamstrung on some of the bigger benefits of DX12 thanks to their architecture. 
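For anyone wondering why async compute helps at all: the claim is simply that independent graphics and compute work can overlap instead of running back to back. As a rough CPU-side analogy only (Python threads standing in for GPU queues; the function names are invented for this sketch, not anything from AotS or the drivers):

```python
import threading
import time

def run(tasks, concurrent):
    """Run the tasks back to back or overlapped; return wall-clock time."""
    start = time.perf_counter()
    if concurrent:
        threads = [threading.Thread(target=t) for t in tasks]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
    else:
        for t in tasks:
            t()
    return time.perf_counter() - start

# Stand-ins for a graphics pass and an independent compute pass; the
# sleeps model time spent stalled on resources the other task could use.
def graphics_pass():
    time.sleep(0.05)

def compute_pass():
    time.sleep(0.05)

serial = run([graphics_pass, compute_pass], concurrent=False)
overlapped = run([graphics_pass, compute_pass], concurrent=True)
print(f"serial: {serial:.3f}s, overlapped: {overlapped:.3f}s")
```

The overlapped run finishes in roughly half the time, but only because the two stand-in tasks never contend for the same resource; whether real workloads overlap that cleanly on a given architecture is exactly what the argument above is about.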


3 minutes ago, Belgarathian said:

I agree with you that the results are biased, but that's because AMD pushes it so hard; it is by far the most well-developed DX12 game. Its engine can demo, and has demoed, pretty much all the benefits of DX12 and async compute. If another game offered that level of depth, I'm sure they would demo that too.

 

Nvidia is hamstrung on some of the bigger benefits of DX12 thanks to their architecture. 

I wouldn't purely call it a uArch issue; it's also the driver's interpretation of compute instructions to pass to the CUDA cores... The 1080 drivers are very new, and AMD has had a long time to sort out their stuff since the Fury X released...



2 minutes ago, GidonsClaw said:

I wouldn't purely call it a uArch issue; it's also the driver's interpretation of compute instructions to pass to the CUDA cores... The 1080 drivers are very new, and AMD has had a long time to sort out their stuff since the Fury X released...

AMD has been playing with async compute since Mantle, which was 4 years ago or thereabouts. Most of the Oxide engine (used in AotS) was used in the Star Swarm demo for Mantle.


Oz? Does he mean Australia? If so, where is he getting the reliable Internet? lol

 

Didn't they only show one benchmark, though? Why not more games? I don't believe they're lying, but I definitely think they're hiding something if this was the only game benchmarked.


I was going to put my money on the GTX 1070...

 

but I guess not...

Blue Jay

CPU: Intel Core i7 6700k (OC'd 4.4GHz) Cooler: CM Hyper 212 Evo Mobo: MSI Z170A Gaming Pro Carbon GPU: EVGA GTX 950 SSC RAM: Crucial Ballistix Sport 8GB (1x8GB) SSD: Samsung 850 EVO 250 GB HDD: Seagate Barracuda 1TB Case: NZXT S340 Black/Blue PSU: Corsair CX430M

 

Other Stuff

Monitor: Acer H236HL BID Mouse: Logitech G502 Proteus Spectrum Keyboard: I don't even know Mouse Pad: SteelSeries QcK Headset: Turtle Beach X12

 

GitHub


@App4that What I meant about 480

Sloth's the name, audio gear is the game
I'll do my best to lend a hand to anyone with audio questions, studio gear and value for money are my primary focus.

Click here for my Microphone and Interface guide, tips and recommendations
 

For advice I rely on The Brains Trust :
@rice guru
- Headphones, Earphones and personal audio for any budget 
@Derkoli- High end specialist and allround knowledgeable bloke


13 hours ago, -BirdiE- said:

Yeah... Well if the shaders working improperly means the game looks better and is less taxing on a GPU, then please sign me up for the broken shading...

Looking at the lack of snow on the path/dirt road in the comparison during AMD's event lends credence to it being broken on the 1080. I expected more from you than the level of pettiness that statement shows.

15 hours ago, TidaLWaveZ said:

 

~$140 price difference.  May be a deal breaker for certain budgets, but IMO justifies a single card setup.

Interesting maths; might I ask how you arrived at that? I assume you meant to say $240, since 700 - 230 - 230 = 240, not 140.
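The arithmetic in question is easy to check; the $700 single-card and 2 x $230 dual-card figures below are the ones implied by this exchange, not official prices:

```python
# Price-difference check using the figures implied in the thread above.
single_card = 700        # hypothetical single high-end card
dual_card = 2 * 230      # two cheaper cards in a dual setup
difference = single_card - dual_card
print(difference)  # 240
```

So the gap is $240, not the ~$140 originally quoted.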


CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


2 minutes ago, DXMember said:

Cheating in benchmarks has been going on for countless years... something needs to change.  I want to see a $1 billion fine every time someone's found guilty.  This is serious stuff - people base their buying decisions on results like this and so it's no different than that massive VW emissions scandal.

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


1 minute ago, Ryan_Vickers said:

Cheating in benchmarks has been going on for countless years... something needs to change.  I want to see a $1 billion fine every time someone's found guilty.  This is serious stuff - people base their buying decisions on results like this and so it's no different than that massive VW emissions scandal.

Well, VW and their subsidiary companies, together with the Japanese Mitsubishi, are killing trees and fluffy animals with their emissions scandal.

But I do agree with you: they should build an assembly factory in Europe and sell the ICs and microprocessors at a discount for that.



2 minutes ago, Ryan_Vickers said:

Cheating in benchmarks has been going on for countless years... something needs to change.  I want to see a $1 billion fine every time someone's found guilty.  This is serious stuff - people base their buying decisions on results like this and so it's no different than that massive VW emissions scandal.

The thing is, it's well known... the only thing Nvidia guarantees is the base clock; the boost clock is an estimate, but GPU Boost will always try to get the best clock it can within limits. So technically it is not cheating... you are getting the best you can get...

AMD, on the other hand, advertises a boost clock, but generally the card will throttle down from it to match the environment. So essentially most AMD owners always get less than they expected...
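The distinction described above can be sketched as a toy clock-selection model. This is purely illustrative: the function and its `headroom` parameter are invented for this sketch and are not Nvidia's or AMD's actual boost algorithms.

```python
def effective_clock(base, advertised_boost, headroom):
    """Toy model: pick the highest clock the thermal/power headroom allows,
    never below the guaranteed base clock.

    headroom: 0.0 (fully constrained) .. 1.0 (unconstrained)
    """
    achievable = base + headroom * (advertised_boost - base)
    return max(base, min(advertised_boost, achievable))

# Unconstrained case: the card reaches the advertised boost figure.
print(effective_clock(base=1607, advertised_boost=1733, headroom=1.0))  # 1733.0

# Throttled case: a hot chassis leaves the card somewhere between
# base and boost, below what the box advertised.
print(effective_clock(base=1000, advertised_boost=1266, headroom=0.5))  # 1133.0
```

In this model the only hard guarantee is the base clock; whether the advertised boost reads as a conservative floor (the GPU Boost framing above) or an optimistic ceiling (the AMD framing) is just a question of how often real-world headroom reaches it.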



6 hours ago, Demonking said:

Because AMD makes their own CPUs, which compete with Intel's, using the competition's product instead of their own to test another of their own products shows they have no faith in said product, yet they expect people to buy it. How is this so hard to understand?

Intel has the i7-980X from 4 years ago; why aren't they using their own product?

[Out-of-date] Want to learn how to make your own custom Windows 10 image?

 

Desktop: AMD R9 3900X | ASUS ROG Strix X570-F | Radeon RX 5700 XT | EVGA GTX 1080 SC | 32GB Trident Z Neo 3600MHz | 1TB 970 EVO | 256GB 840 EVO | 960GB Corsair Force LE | EVGA G2 850W | Phanteks P400S

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA

Server 01: Intel Xeon D 1541 | ASRock Rack D1541D4I-2L2T | 32GB Hynix ECC DDR4 | 4x8TB Western Digital HDDs | 32TB Raw 16TB Usable

Server 02: Intel i7 7700K | Gigabye Z170N Gaming5 | 16GB Trident Z 3200MHz

