
PowerColor Releases Ad for Upcoming AMD GPU

nunya bus

Guys... 4GB is fine for 4K; sure, you can always use more. But the real question for VRAM is how quickly data can be loaded, used, and unloaded once a texture is no longer needed. I don't know why we have come back to the "more GB means a better card" mentality of the early/mid 2000s (hmmm, do I get a 6800 GT with 128MB of VRAM or this 7300 GT with 256MB? Answer: the 6800 GT is the better card). I would much rather have 4GB @ ~600GB/s than 8GB @ ~300GB/s.
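A quick back-of-the-envelope sketch of that trade-off (the 3GB working-set figure and the two bandwidth numbers are made-up round values for illustration, not measurements from any particular card):

```cpp
#include <cstdio>

// Illustrative only: how long it takes to move a hypothetical texture
// working set across the memory bus at two different bandwidths.
int main() {
    const double working_set_gb = 3.0;   // assumed streaming load, not a real measurement
    const double fast_gbps = 600.0;      // lower-capacity, high-bandwidth card
    const double slow_gbps = 300.0;      // higher-capacity, lower-bandwidth card

    std::printf("3 GB moved @ 600 GB/s: %.1f ms\n", working_set_gb / fast_gbps * 1000.0); // ~5 ms
    std::printf("3 GB moved @ 300 GB/s: %.1f ms\n", working_set_gb / slow_gbps * 1000.0); // ~10 ms
    return 0;
}
```

At 60fps a frame is roughly 16.7ms, so whenever data actually has to move, halving the bandwidth eats twice as big a slice of the frame budget.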

 

There are maybe one or two games that need more than 4GB at 4K, and all information points to an 8GB version of the 390 being released later. Please don't make a big deal of something that clearly isn't a big deal.

^ People think 4GB is nothing for some reason... The Witcher 3 at 1440p only uses 2GB for me. Star Citizen caps out at 4GB even at 1080p, but that game isn't exactly optimised, and uncompressed textures will do that.

i7 6700K - ASUS Maximus VIII Ranger - Corsair H110i GT CPU Cooler - EVGA GTX 980 Ti ACX2.0+ SC+ - 16GB Corsair Vengeance LPX 3000MHz - Samsung 850 EVO 500GB - AX760i - Corsair 450D - XB270HU G-Sync Monitor

i7 3770K - H110 Corsair CPU Cooler - ASUS P8Z77 V-PRO - GTX 980 Reference - 16GB HyperX Beast 1600MHz - Intel 240GB SSD - HX750i - Corsair 750D - XB270HU G-Sync Monitor

DirectX 12 should do a much better job of keeping graphics cards fed. Keep in mind that single-threaded performance will no longer be as mission critical. With up to six rendering threads all communicating directly with the GPU, chips like the FX-6300 should do a much better job than they currently do when it comes to feeding the GPU. It may even be enough for the FX-6300 to feed both the R9 290X and the GTX 980 without a hassle. The need to pipe resources back to the main thread in order to push them through the graphics stack is what currently cripples AMD chips. The main thread is also where most of the game logic happens, so having other threads that can talk directly to the GPU may go a long way toward solving the problem.
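A minimal sketch of that idea, not real Direct3D 12 code: the `CommandList` struct below is a hypothetical stand-in for an API command list, and each worker thread records its own list instead of funnelling every draw back through a single main thread:

```cpp
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

// Hypothetical stand-in for an API command list (not the real D3D12 interface).
struct CommandList {
    std::vector<std::string> commands;
    void draw(int object_id) { commands.push_back("draw " + std::to_string(object_id)); }
};

// Pretend GPU submit; in a real API this would be one call on a command queue.
void submit(const std::vector<CommandList>& lists) {
    for (const auto& cl : lists)
        std::printf("submitting %zu commands\n", cl.commands.size());
}

int main() {
    const int num_threads = 6;            // e.g. one rendering thread per FX-6300 core
    const int objects_per_thread = 1000;

    std::vector<CommandList> lists(num_threads);
    std::vector<std::thread> workers;

    // Each thread records into its own command list in parallel -- no detour
    // through one overloaded main thread just to reach the graphics stack.
    for (int t = 0; t < num_threads; ++t) {
        workers.emplace_back([&lists, t, objects_per_thread] {
            for (int i = 0; i < objects_per_thread; ++i)
                lists[t].draw(t * objects_per_thread + i);
        });
    }
    for (auto& w : workers) w.join();

    submit(lists);   // one cheap submission at the end
    return 0;
}
```

The point isn't the toy types; it's the shape of the work: recording is spread across cores and only the final submission is serialized.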

Again, it's up to the devs to take advantage of DX12, and that won't be happening in current games. And by the way, you're forgetting that Intel will benefit as well, negating any advantage AMD would see on their aging line of CPUs (3-4 years is very old for a CPU). And if anyone brings up Mantle, all it did was cover for the poor state of AMD's DX11 driver; Nvidia's DX11 performs extremely close to Mantle.

Edit: Off topic, but AMD's chips were crippled from the start. People were unimpressed with their lackluster performance at launch, and even those who aren't tech-minded noticed that a supposed upgrade from an AMD quad core to an '8 core' actually gave them a slower computer. (2 modules = 4 integer cores, 3 modules = 6 integer cores, 4 modules = 8 integer cores. What previously would have been dual-, tri-, and quad-core Phenom II CPUs were partly split to give the semblance of several true processing cores, which is far from the reality and the cause of AMD's IPC woes.)

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I'm not even running 1080p...

#1050pmustardrace


I'm not even running 1080p...

#1050pmustardrace

1440x1050? (which my current laptop has)

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


1440x1050? (which my current laptop has)

 

No, 1680x1050


No, 1680x1050

Oh... so the middle of the 16:10 range, not the top of the 4:3 range.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


Again, it's up to the devs to take advantage of DX12, and that won't be happening in current games. And by the way, you're forgetting that Intel will benefit as well, negating any advantage AMD would see on their aging line of CPUs (3-4 years is very old for a CPU). And if anyone brings up Mantle, all it did was cover for the poor state of AMD's DX11 driver; Nvidia's DX11 performs extremely close to Mantle.

Edit: Off topic, but AMD's chips were crippled from the start. People were unimpressed with their lackluster performance at launch, and even those who aren't tech-minded noticed that a supposed upgrade from an AMD quad core to an '8 core' actually gave them a slower computer. (2 modules = 4 integer cores, 3 modules = 6 integer cores, 4 modules = 8 integer cores. What previously would have been dual-, tri-, and quad-core Phenom II CPUs were partly split to give the semblance of several true processing cores, which is far from the reality and the cause of AMD's IPC woes.)

A lot of higher-ups have made it clear that they want to push DirectX 12 hard onto the industry, and I wouldn't doubt Microsoft will be another one, given Windows 10 sales. I'm not forgetting that Intel will see the same helping hand, although it will be more apparent on AMD platforms given the current situation (Intel can already push GPUs to their performance wall). DirectX 12 will help lower frame times, which helps increase frame rate for everyone, and if you're currently not utilizing your GPU to its fullest, there's even more performance to be had.

One of the biggest problems with AMD right now is frame-rate dips. The lack of IPC causes a utilization ripple on the GPU that's not supposed to occur (it indicates insufficient CPU performance). What Nvidia did was jump the gun with a threaded driver model; their DirectX 11 driver is much better than AMD's DirectX 11 driver for this very reason. Past two cores AMD's driver stops scaling completely, while Nvidia's driver will continue to scale up to four cores. That's helped give them a competitive edge in the market right now.

Once DirectX 12 launches, though, AMD will be able to squeeze a bigger gain out of their driver than Nvidia, since Nvidia has already juiced its driver for extra threading capacity. This doesn't mean AMD hardware is going to outperform the competition; it just means we can expect a bigger performance increase from AMD hardware once DirectX 12 comes into play (as we've covered in the 3DMark API Overhead testing thread). In that regard, AMD graphics cards should theoretically run better on Intel-based platforms right now, simply because of how serialized the driver is.
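A tiny illustration of why the frame-time dips matter more than the average number suggests (both frame-time traces are invented for the example, not measured data):

```cpp
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    // Two invented traces of 60 frames with similar total time:
    // one steady, one with periodic CPU-bound spikes (the "utilization ripple").
    std::vector<double> steady(60, 16.7);
    std::vector<double> spiky(60, 14.0);
    for (int i = 0; i < 60; i += 10) spiky[i] = 43.7;   // a 43.7 ms hitch every 10th frame

    auto avg_fps = [](const std::vector<double>& ms) {
        return ms.size() * 1000.0 / std::accumulate(ms.begin(), ms.end(), 0.0);
    };

    std::printf("steady trace: %.1f fps average, worst frame 16.7 ms\n", avg_fps(steady));
    std::printf("spiky  trace: %.1f fps average, worst frame 43.7 ms\n", avg_fps(spiky));
    // Both averages land near 60 fps, but only the second trace feels like stutter.
    return 0;
}
```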


You'll have to opt for a GTX 980 or R9 290X, as there's nothing stronger than them at that price point. The 980 Ti will be up there with the TITAN X, as will Fiji. So personally, if you're not looking to go multi-GPU in the future and just want the best performance you can get for your dollar right now, the R9 295X2 is the ticket. It's marginally faster than the GTX 980 while still being a single card (dual Hawaii). It even runs cooler than the GTX 980, but pulls a hell of a lot more power (not a big issue, since it's a dual-GPU card and roughly 2x the power is expected). That's where I'd spend my money if I were looking for an upgrade and only planned on running a single card.

 

Oh hehe, no no, not right now, my two 780s are doing just fine. What I mean is, in the future (maybe next year?) I want a single, affordable card that can play all games at ultra at 60fps in 1440p. My cards are fine, but I'm getting tired of having two cards in there... :(

 

But thank you for the recommendations and yep you are right.

●CPU: i7-4790K w/H100i ●Mobo: MSI Z97 MPower ●RAM: Corsair 16GB Dominator ●GPU: EVGA ACX SC 780 3GB(X2) ●SSD: 850 Pro 256GB ●Case: 450D ●PSU: AX 860i ●Monitor: Asus PB278Q 1440p


Damn it, I just need to know how fast it can run Skyrim  :lol:

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Oh hehe, no no, not right now, my two 780s are doing just fine. What I mean is, in the future (maybe next year?) I want a single, affordable card that can play all games at ultra at 60fps in 1440p. My cards are fine, but I'm getting tired of having two cards in there... :(

 

But thank you for the recommendations and yep you are right.

It's one of those things where we'll have to wait and see what happens. AMD has some big plans in the works for Arctic Islands, and Nvidia's CEO is throwing around CEO math again with Pascal. So we really don't know what to expect until the products hit the market and we can get our hands on them.

 

Damn it, I just need to know how fast it can run Skyrim   :lol:

Current cards scream through Skyrim with HD texture packs, unless you're playing above FHD.

