AMD's Fiji series

monaka25

What is your thoughts on AMD's "Fiji" lineup?

It will still consume more power than a 980 Ti, it will still have bad drivers, poor driver support and lots of issues, it will not perform as well as a 980 Ti in most games outside of the AMD-sponsored ones, and it will still not have GameWorks, PhysX, ShadowPlay, CUDA, G-Sync, VXGI, MFAA, GPU Boost 2.0 and ALL the other great technologies built into Maxwell... nobody should throw $700 at an AMD GPU... EVER!

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


It will still consume more power than a 980 Ti, it will still have bad drivers, poor driver support and lots of issues, it will not perform as well as a 980 Ti in most games outside of the AMD-sponsored ones, and it will still not have GameWorks, PhysX, ShadowPlay, CUDA, G-Sync, VXGI, MFAA, GPU Boost 2.0 and ALL the other great technologies built into Maxwell... nobody should throw $700 at an AMD GPU... EVER!

 

I'm not sure I agree. It's not like Nvidia's driver support has been great lately, with the last two releases crashing on Maxwell cards all the time (I had to roll back from 352.86 to 350.12 because I got sick of the driver crashing every 10 minutes while I browsed the web) and with Witcher 3's performance being gimped on Kepler. I'll concede AMD's DirectX 11 drivers are crap if you're running a low-end CPU like an i3 or an FX-8350, but with an i5 or better AMD's performance is pretty solid. If the Fury X has 980 Ti performance and comes watercooled out of the box, I think it would be a better buy than at least the loud reference 980 Ti. The Gigabyte Gaming GTX 980 Ti does look incredible, though. Not that I'll buy either, since I'm mostly happy with my GTX 970 right now (on 350.12, that is) and probably won't upgrade until it starts sucking at 1080p. I'm still kind of pissed I can't play Witcher 3 with optimized drivers yet without bringing all the crashes back, though.


I'm not sure I agree. It's not like Nvidia's driver support has been great lately, with the last two releases crashing on Maxwell cards all the time (I had to roll back from 352.86 to 350.12 because I got sick of the driver crashing every 10 minutes while I browsed the web) and with Witcher 3's performance being gimped on Kepler. I'll concede AMD's DirectX 11 drivers are crap if you're running a low-end CPU like an i3 or an FX-8350, but with an i5 or better AMD's performance is pretty solid. If the Fury X has 980 Ti performance and comes watercooled out of the box, I think it would be a better buy than at least the loud reference 980 Ti. The Gigabyte Gaming GTX 980 Ti does look incredible, though. Not that I'll buy either, since I'm mostly happy with my GTX 970 right now (on 350.12, that is) and probably won't upgrade until it starts sucking at 1080p. I'm still kind of pissed I can't play Witcher 3 with optimized drivers yet without bringing all the crashes back, though.

G-Sync and GameWorks are IMHO deal breakers from now on... I certainly want my next GPU and next monitor to be G-Sync and more than capable of 1440p gaming... and I want the HairWorks, the lightworks, the shadowworks, the everything-we-can-render-so-it-looks-amazingly-better-now-works... I want it ALL.

 

Have you seen the next fuckin' Batman? It's like: there you go, you can have those crappy 1998-looking smoke effects, or this beautiful, amazingly rendered, volumetric, physics-reactive advanced fog and smoke technology... $700 GPU, man... 1990s smoke... nahhh...



All I will say is this... The greens consume less power than the reds

Nuff said

Indus Monk = Indian+ Buddhist


I never knew 4K resolutions could have their pixel count reduced so they use less VRAM, making 4GB equivalent to 6GB. The main point of HBM was to deliver better performance at lower power consumption and to reduce the space required on the PCB.

 

VRAM usage is MOSTLY affected by screen resolution. And like I said above, 4K is 4K, and 4 gigabytes is 4 gigabytes. HBM still renders 4K and will use the same amount of VRAM, because the pixels on the monitor remain the same.

 

 

 

Actually, most vRAM usage is the game assets.

 

The actual difference in framebuffer size is pretty small (can't remember the calculation off the top of my head, but it's measured in less than 100 MB), but the big things that eat up vRAM are textures (and devs preloading textures into vRAM that they don't really need), shadow maps and post-processing effects. (MSAA is also a big one; 8x MSAA at 4K is easily 2.5 GB of vRAM usage.)
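For reference, the framebuffer arithmetic the poster couldn't recall can be sketched like this. The buffer layout (32-bit RGBA color plus a 32-bit depth buffer, double buffered) is an assumption for illustration; real engines vary, but the order of magnitude holds:

```python
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Size of one render target at the given resolution."""
    return width * height * bytes_per_pixel

def total_mib(width, height, surfaces=4):
    """Total size of `surfaces` 4-byte-per-pixel buffers, in MiB.
    Default of 4 = two color buffers (double buffering) + depth,
    rounded up to a simple worst case."""
    return surfaces * framebuffer_bytes(width, height) / 2**20

# 4K vs 1080p: the delta works out to roughly 95 MiB under these
# assumptions, consistent with "measured in less than 100 MB".
delta = total_mib(3840, 2160) - total_mib(1920, 1080)
```

MSAA changes the picture because each multisampled target stores several samples per pixel, which is why 8x MSAA at 4K balloons so quickly.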

 

The only game that gives me a problem that can't easily be fixed is AC: Unity; there's something weird with that game's vRAM usage at 4K. (It'll show 3.8 GB of usage but will stutter as if it's exceeding it until I drop to 1440p.) But that game is a mess for a dozen reasons.

 

 

Unless you are running SLI (and in some cases even if you are), I'm not sure how 4GB is "lots" for 4K when every benchmark and their mother's, running 4K with MOST eye candy turned on, is hitting 20-50 FPS depending on the game... So no, I'm not gonna be maxing out a 4K panel with a single 4GB GPU any time soon, or with 6GB for that matter...

 

4GB is lots because most games don't exceed it. (Until you start trying MSAA, which isn't viable past 2x at 4K for performance reasons anyway.)

 

People think that because they see their card using 4 GB at 1080p, 4K must need more vRAM.

 

It's a common mistake I'm just hoping to clear up.

4K // R5 3600 // RTX2080Ti


G-Sync and GameWorks are IMHO deal breakers from now on... I certainly want my next GPU and next monitor to be G-Sync and more than capable of 1440p gaming... and I want the HairWorks, the lightworks, the shadowworks, the everything-we-can-render-so-it-looks-amazingly-better-now-works... I want it ALL.

 

Have you seen the next fuckin' Batman? It's like: there you go, you can have those crappy 1998-looking smoke effects, or this beautiful, amazingly rendered, volumetric, physics-reactive advanced fog and smoke technology... $700 GPU, man... 1990s smoke... nahhh...

 

We still don't know which side will win between G-Sync and FreeSync, and so few games support PhysX on the GPU. And what's the performance impact of that smoke going to be in the new Batman? AMD has equivalents to the main Nvidia features except PhysX and ShadowPlay. If you're interested in streaming, then Nvidia definitely seems like the way to go, but otherwise it's not so cut and dried between Nvidia and AMD.


We still don't know which side will win between G-Sync and FreeSync, and so few games support PhysX on the GPU. And what's the performance impact of that smoke going to be in the new Batman? AMD has equivalents to the main Nvidia features except PhysX and ShadowPlay. If you're interested in streaming, then Nvidia definitely seems like the way to go, but otherwise it's not so cut and dried between Nvidia and AMD.

 

AMD Gaming Evolved even offers features comparable to ShadowPlay.

 

Personally, I dislike G-Sync because until I replace the monitor I'm kind of stuck with Nvidia. If Nvidia and AMD both used FreeSync technology, I'd still have the choice.

 

But I think by the time I need a GPU upgrade there will be a 4K IPS panel or some such thing I can replace it with.



All I will say is this... The greens consume less power than the reds

Nuff said

and?


AMD Gaming Evolved even offers features comparable to ShadowPlay.

 

Personally, I dislike G-Sync because until I replace the monitor I'm kind of stuck with Nvidia. If Nvidia and AMD both used FreeSync technology, I'd still have the choice.

 

But I think by the time I need a GPU upgrade there will be a 4K IPS panel or some such thing I can replace it with.

 

I'm definitely holding off on buying an adaptive-sync monitor until one of the technologies kills off the other. As awesome as it would be for my 970 now, there's no way I'm locking myself out of possibly buying an AMD GPU next time. Especially when the monitors are so expensive right now.


I'm definitely holding off on buying an adaptive-sync monitor until one of the technologies kills off the other. As awesome as it would be for my 970 now, there's no way I'm locking myself out of possibly buying an AMD GPU next time. Especially when the monitors are so expensive right now.

FreeSync only works well in a very tiny window, something like 45 to 70 FPS depending on the monitor, whereas G-Sync works from 1 FPS all the way up to 1000 FPS... G-Sync is clearly superior to FreeSync:

 

 

Listen to this carefully, then you'll know which one you want...



FreeSync only works well in a very tiny window, something like 45 to 70 FPS depending on the monitor, whereas G-Sync works from 1 FPS all the way up to 1000 FPS... G-Sync is clearly superior to FreeSync:

 

 

Listen to this carefully, then you'll know which one you want...

PCPer, classy.

Regular human bartender...Jackie Daytona.


and?

Don't you know the extra $27 a YEAR has a huge impact, while we're discussing GPUs that cost $700 up front and will be replaced in 1-2 years?!?! ;)
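A rough sanity check on that yearly figure. The inputs (a 100 W draw difference, 6 hours of gaming a day, $0.12/kWh) are assumptions for illustration, not numbers from this thread:

```python
def annual_cost_usd(extra_watts, hours_per_day, usd_per_kwh):
    """Yearly electricity cost of an extra power draw."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# 100 W extra, 6 h/day, $0.12/kWh: roughly $26 a year,
# in the ballpark of the "$27 yearly" figure above.
cost = annual_cost_usd(100, 6, 0.12)
```

At those assumed rates, the power-draw difference is a rounding error next to a $650-700 card replaced every couple of years.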


FreeSync only works well in a very tiny window, something like 45 to 70 FPS depending on the monitor, whereas G-Sync works from 1 FPS all the way up to 1000 FPS... G-Sync is clearly superior to FreeSync:

 

 

Listen to this carefully, then you'll know which one you want...

 

FreeSync works the same as G-Sync, from, as you say, "1 FPS to 1000 FPS". That 45-to-75 window is only specific to certain monitors, because with FreeSync the monitor manufacturer sets the window, whereas Nvidia sets the standard for G-Sync. There are already new FreeSync monitors that have much larger windows. Do a little research: Linus did a video on a FreeSync BenQ monitor with a much larger window.
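For what it's worth, the way variable-refresh hardware reportedly covers framerates below the panel's window is by repeating the same frame so the effective refresh lands back inside the supported range. A minimal sketch of that multiplier logic (the function and numbers are illustrative, not from any actual driver):

```python
def refresh_multiplier(fps, window_min, window_max):
    """Smallest repeat count n so that fps * n falls inside
    [window_min, window_max]; None if the window is too narrow."""
    n = 1
    while fps * n < window_min:
        n += 1
    return n if fps * n <= window_max else None

# A wide 30-144 Hz window: 24 FPS content is shown twice per frame (48 Hz).
# A narrow 45-75 Hz window: 40 FPS has no valid multiple (80 Hz overshoots),
# which is exactly why narrow FreeSync windows were criticized.
```

As the narrow-window case shows, the trick only always works when the window's max is at least double its min, so panel range matters more than the branding.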


It will still consume more power than a 980 Ti, it will still have bad drivers, poor driver support and lots of issues, it will not perform as well as a 980 Ti in most games outside of the AMD-sponsored ones, and it will still not have GameWorks, PhysX, ShadowPlay, CUDA, G-Sync, VXGI, MFAA, GPU Boost 2.0 and ALL the other great technologies built into Maxwell... nobody should throw $700 at an AMD GPU... EVER!

 

Actually, it should be as efficient as, or more so than, the 980 Ti. Their driver support has gotten loads better, and on top of that they have been working in partnership with Microsoft on DX12, which means driver support will only get better. It should perform equal to or even better than the 980 Ti. Also, where in the heck are you getting this $700? The water-cooled version is only $650 and the air-cooled is only $550, whereas your standard 980 Ti is around $660.


While it wouldn't surprise me, I'm not gonna trust his videos; I want a reputable source. What I've heard about the video (I haven't seen it, so I could be wrong) is that the card he shows looks like and has the same cooler as the 290X, but AMD showed that they have new coolers across the line. So what's to say he wasn't just trolling, or that someone sold him a 290X and told him it was a 390X? Also, correct me if I'm wrong, but AMD never released a stock-cooled 8GB 290X, did they? So at the very least we should see a difference from the amount of vRAM alone, right?

Once again, please feel free to correct me; this is mostly just my theory based on what I've been told and seen.

The Best Buy that he bought it from actually did sell that 390X by accident. It wasn't a troll, to say the least. The issue is he probably didn't OC it. Maybe that's something about the 390X... who knows. But it definitely wasn't a troll. Heck, even watch one of the Netlinked Daily episodes; it even says it was a 390X he bought. Idk which one, though.

MOTHERBOARD: some Asus motherboard CPU: I3-4130 GPU: Gigabyte(?) GT-1030 RAM: 8GB G.SKILL SNIPER 1600MHZ RAM + 4GB AMD ram PSU: Corsair CX450 CASE: Corsair Spec-03 OS: Win 10 64 bit Keyboard: Logitech G510s Mouse: Corsair G300s Camera: Canon EOS Rebel T5
I like outdoor warning sirens. Ask me anything about them.

 

 


It will still consume more power than a 980 Ti, it will still have bad drivers, poor driver support and lots of issues, it will not perform as well as a 980 Ti in most games outside of the AMD-sponsored ones, and it will still not have GameWorks, PhysX, ShadowPlay, CUDA, G-Sync, VXGI, MFAA, GPU Boost 2.0 and ALL the other great technologies built into Maxwell... nobody should throw $700 at an AMD GPU... EVER!

Well... there is no proof of that; this is a completely new architecture, a new chip. It is just a single GPU chip, and it has 4096 stream processors. Yes, it may consume more power, but it is actually most likely to be more powerful.


 

 


- AMD has released a card that is more powerful than the 980 Ti, at the same price,

- as well as possibly more powerful than the Titan X, at $350 less. I think AMD has won this time around.

- I think AMD has shot Nvidia out of the sky.

- Even if it isn't exactly the performance of the Titan X, it will still be better than the 980 Ti, and at the same price.

- a card that's generally better than the Titan X, at a lower price..

 

Well... there is no proof of that; this is a completely new architecture, a new chip. It is just a single GPU chip, and it has 4096 stream processors. Yes, it may consume more power, but it is actually most likely to be more powerful.

Fair enough, but remember this was an answer to your original post, which there was "no proof of" either... highlighted above are all the suppositions you took for fact in your original post, son.

You must admit your original post was quite something?!

Also, where in the heck are you getting this $700? The water-cooled version is only $650

$650 plus tax and shipping will end up well above $700... and that's in the US... here in Canada it will be like $829... plus tax, plus shipping... like the 980 Ti...



$650 plus tax and shipping will end up well above $700... and that's in the US... here in Canada it will be like $829... plus tax, plus shipping... like the 980 Ti...

 

It's pretty nice. I live in Texas and order from Newegg in California, so I don't get charged sales tax on hardware.


Where I live I also don't pay Newegg tax, and they provide free shipping with Eggsaver, which I found to be as fast as 3-day shipping. TigerDirect is different for me, but free for others depending on what state you're in.

cpu:i7-4770k    gpu: msi reference r9 290x  liquid cooled with h55 and hg10 a1     motherboard:z97x gaming 5   ram:gskill sniper 8 gb


 

AMD talked about optimizing drivers for those 4GB, and I believe it will be enough. 4GB is still 4GB, but only if they can use it efficiently, kind of like how on Maxwell the L2 cache helps take load off the memory bandwidth. It's 512 GB/s compared to 336 GB/s, which is what I think will provide the efficiency it needs to work well with 4GB.

 

 

Wasn't DirectX 12 also supposed to help reduce vRAM requirements by reusing textures already in memory instead of reloading them for different objects, or something along those lines? I could have sworn I read something like that. This would also help offset the technically lower amount of memory available.

 

Another thing I heard is that with this very large bandwidth they can swap what is in the memory a lot quicker, so chunks accessed only briefly can be freed up for other things. That would of course mean more communication with the rest of the system, but hey, we might finally saturate the PCIe 3.0 lanes.
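As a back-of-the-envelope illustration of those bandwidth numbers (a sketch, not a claim about real driver behavior; the pool sizes and peak rates are the ones quoted above):

```python
def full_refill_ms(vram_gb, bandwidth_gb_per_s):
    """Time to rewrite an entire VRAM pool once at peak bandwidth."""
    return vram_gb / bandwidth_gb_per_s * 1000

hbm = full_refill_ms(4, 512)    # Fury X: 4 GB over 512 GB/s, ~7.8 ms
gddr5 = full_refill_ms(6, 336)  # 980 Ti: 6 GB over 336 GB/s, ~17.9 ms

# Caveat: assets swapped in from system RAM still cross PCIe 3.0 x16
# (roughly 16 GB/s), which is the real bottleneck in that scenario.
```

So on-card shuffling is much faster with HBM, but anything coming over the bus is limited by PCIe long before either memory type.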


 

Fair enough, but remember this was an answer to your original post, which there was "no proof of" either... highlighted above are all the suppositions you took for fact in your original post, son.

You must admit your original post was quite something?!

$650 plus tax and shipping will end up well above $700... and that's in the US... here in Canada it will be like $829... plus tax, plus shipping... like the 980 Ti...

The air-cooled Fury will only be $550!! And yes, if you watched the presentation, you would know the only difference between it and the X is the water cooler.


and?

and that is how things will be in the future



and that is how things will be in the future

power consumption doesn't mean shit.


Main ||  i7-4790K @ 4.4GHz || Corsair H90 || Gigabyte GA-Z97X-UD3H-BK || HyperX Fury 16GB || Samsung 840 EVO 500GB || WD Green 4TB || EVGA GTX 750 Ti FTW || Fractal Core 3500 || Corsair CX 500M ||


 


Wiseplex || i7-4790 @ 3.6GHz || Corsair H50 || Gigabyte GA-Z97N-WIFI || HyperX Fury 16GB || Samsung 840 EVO 250GB || WD Green 6TB || EVGA GTX 750 Ti SC || Fractal Node 605 || Corsair CSM 450 ||


Yeah, that's what I was talking about... not enough of a performance difference to justify the loss of all the great Nvidia features built into Maxwell. I would go with a 980 Ti any day of the week... the 980 Ti is probably more energy efficient as well, and will have better driver support and better support from game developers... as is always the case with Nvidia.


