
AMD is going to be in a tough spot at the high end because of the 980 Ti.

Embattled

Their HBM memory will be the determining factor, especially if it complements DX12. However, they will definitely need aggressive pricing on the 390 and 390X, because all the other 300 series cards are just a bunch of rebrands again.

Handing out TechRx's


AMD's goal should be a high-end graphics card that can play all current games at 60 FPS at 4K resolution.


What if they made the 295X2 the new 390X?! Just an air-cooled version; that would pip all the cards up to the Titan.

AMD Fury is the new 390X and the 390X might as well be the new 390 :P

Regular human bartender...Jackie Daytona.


That would be a horrible decision. Dual-GPU cards have their own set of complications and drawbacks. Single GPU is always better.

The new Fury card with its HBM will have its own complications and drawbacks. They'll be expecting game developers to code games to work with the HBM; if they don't, it could be just as much of a lemon as a non-SLI/CFX-coded game on an SLI/CFX setup.

Gaming PC: • AMD Ryzen 7 3900X • 16 GB Corsair Vengeance RGB Pro 3200 MHz • Founders Edition 2080 Ti • 2x Crucial 1 TB NVMe SSD • NZXT H1 • Logitech G915 TKL • Logitech G Pro • Asus ROG XG32VQ • SteelSeries Arctis Pro Wireless

Laptop: MacBook Pro M1 512 GB


The new Fury card with its HBM will have its own complications and drawbacks. They'll be expecting game developers to code games to work with the HBM; if they don't, it could be just as much of a lemon as a non-SLI/CFX-coded game on an SLI/CFX setup.

Do you have any source for this? I've not heard one mention of issues developing with HBM in mind. As far as I'm aware, the Dev codes through the API, which just sends commands to the memory controller - the API shouldn't give a shit what kind of memory is used, since all it sees is the memory controller, which should have standardized commands.
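
Just to illustrate what I mean (my own rough sketch, not anything official from AMD or any API docs): here's roughly what allocating a vertex buffer looks like through OpenGL, assuming a GL 1.5+ context with the usual loader (GLEW/GLAD) already initialized. The helper name create_vertex_buffer is made up for the example. Nothing in these calls says whether the VRAM behind them is GDDR5 or HBM.

/* Rough sketch, not production code: allocating a vertex buffer through
 * OpenGL. Assumes a GL 1.5+ context and that GLEW has been initialized.
 * create_vertex_buffer is a hypothetical helper name for this example. */
#include <GL/glew.h>

GLuint create_vertex_buffer(const float *vertices, GLsizeiptr size_bytes)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);               /* driver hands back a buffer name */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);  /* bind it as a vertex buffer */
    /* The driver and the card's memory controller decide where the data
     * actually lives; GDDR5 vs HBM never shows up at this level. */
    glBufferData(GL_ARRAY_BUFFER, size_bytes, vertices, GL_STATIC_DRAW);
    return vbo;
}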

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Do you have any source for this? I've not heard one mention of issues developing with HBM in mind. As far as I'm aware, the Dev codes through the API, which just sends commands to the memory controller - the API shouldn't give a shit what kind of memory is used, since all it sees is the memory controller, which should have standardized commands.

I'm trying to find it! It was an interview with AMD on the HBM subject, and it mentioned educating developers to code for HBM to make it more beneficial. I'll keep looking lol

Gaming PC: • AMD Ryzen 7 3900X • 16 GB Corsair Vengeance RGB Pro 3200 MHz • Founders Edition 2080 Ti • 2x Crucial 1 TB NVMe SSD • NZXT H1 • Logitech G915 TKL • Logitech G Pro • Asus ROG XG32VQ • SteelSeries Arctis Pro Wireless

Laptop: MacBook Pro M1 512 GB


I'm trying to find it! It was an interview with AMD on the HBM subject, and it mentioned educating developers to code for HBM to make it more beneficial. I'll keep looking lol

Reading that, I would interpret it to mean that developers may need to learn new coding methods to extract additional performance out of it, but that it would not perform worse than GDDR5.

For Sale: Meraki Bundle

 

iPhone Xr 128 GB Product Red - HP Spectre x360 13" (i5 - 8 GB RAM - 256 GB SSD) - HP ZBook 15v G5 15" (i7-8850H - 16 GB RAM - 512 GB SSD - NVIDIA Quadro P600)

 


Ehh, not really. The R9 295X2 performs almost the same as the GTX 980 Ti and has 2 GB more VRAM. Not a fanboy though, I actually am not a fan of AMD.

Though AMD is known for their budget cards

 

The 295X2 is a dual-GPU card that has some very specific requirements when it comes to PSU and case. You shouldn't be comparing a dual-GPU card with a 500 W TDP to a single GPU with a 250 W TDP, even if they perform close to each other.

 

That being said, "almost the same" is BS. The 295X2 is considerably faster than a Titan X; check the benchmarks in AnandTech's 980 Ti review:

http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/4

"It's a taxi, it has a FARE METER."

