Can't decide on GPU for 1440p upgrades

Prokart2000

What GPU?

40 members have voted

  1. What GPU

    • 1060
      13
    • RX480
      4
    • R9 Fury
      20
    • 980
      4


6 minutes ago, i_build_nanosuits said:

AMD and Nvidia both have pros and cons...Nvidia generally has better day-one driver support for games, and they do have a few cool features and software (ShadowPlay, for example)

 

1. AMD also has its own "ShadowPlay".

2. Why the hell are y'all still yapping about ShadowPlay like it's the shit? Study the features, please. Take the "LED VISUALISER", for example. That is a feature AMD has zero answer to.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


1 minute ago, Prokart2000 said:

None on Overclockers; Scan had XFX ones for £260.

Tough choice imo.

 

If the 390X is priced the same as a GTX 1060, the 980 is not worth it and the Fury is too expensive to be worth it, so you've got three choices imho:

 

RX 480 - The slowest card, but hopefully also the cheapest of the three. It's promising for the future with new APIs, and you can CrossFire two of them on most PSUs.


GTX 1060 - Faster than the RX 480, has 6GB of VRAM and no SLI support, but a better choice than a GTX 980 when you account for all the Nvidia-exclusive features and its good DX11 performance.

 

R9 390X - Theoretically the fastest of the three (though it depends on the game) and it should see the biggest boosts when moving to DX12/Vulkan as the hardware is very beefy, but the chip itself is fairly old - I've got the same one in my 290X. It performs GREAT at 1440p and beats a GTX 980 at that resolution, and it has 8GB of VRAM, BUT it draws significantly more power and runs hotter than the other two. For some that's not an issue; I don't mind my 290X hitting 80 degrees when gaming as it's still fairly quiet and still 15 degrees below its max temp, and I don't mind it drawing a bit more power than other cards. Electricity is not expensive here at all, though.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


9 minutes ago, Morgan MLGman said:

The cheapest non-reference GTX 1060 and the cheapest R9 Fury. If the 1060 is more expensive than the Fury, it's not worth it because it's slower.

 

Avoid ASUS and Gigabyte for AMD cards.

That 4GB frame buffer would really worry me for 1440p gaming in the future...honestly...it's not AMD vs Nvidia...it's 4GB vs 6GB.

I wouldn't recommend a 4GB Nvidia card at this point either...but I've seen too many games already pushing 5GB and beyond on my GTX 980 Ti, and I really don't think that playing with lower texture quality in games is a good thing...texture quality is what makes games look good IMHO...and a 4GB frame buffer will be a limitation in a year or two, that's a given...recent games all use at least 4GB at 1440p if you push the settings a bit...and rendering with higher-quality textures does not impact your performance, it just requires you to have more VRAM to avoid stuttering.

The Fury is indeed faster than the 1060, but its 4GB of video memory will have an impact on games in the future, that's a given.

That said, quite often if you drop texture quality from ULTRA to HIGH it won't change much visually but you save something like 750MB of VRAM.
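To put rough numbers on the texture/VRAM point (a back-of-the-envelope sketch only, assuming uncompressed RGBA8 textures with full mip chains; real games use block-compressed formats and streaming, so the actual figures vary):

```python
# Rough estimate of how much VRAM a single texture costs at different
# resolutions, assuming uncompressed RGBA8 (4 bytes per texel) plus a
# full mipmap chain (~1/3 extra). Illustrative orders of magnitude only.

def texture_mb(width, height, bytes_per_texel=4, mipmaps=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base  # mip chain adds ~33%
    return total / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mb(size, size):.0f} MB")

# 1024x1024: ~5 MB
# 2048x2048: ~21 MB
# 4096x4096: ~85 MB
```

Dropping even a few dozen of the biggest textures from 4K to 2K resolution is how an ULTRA-to-HIGH switch can free up hundreds of MB of VRAM without touching frame rate.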

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


1 minute ago, i_build_nanosuits said:

That 4GB frame buffer would really worry me for 1440p gaming in the future...honestly...it's not AMD vs Nvidia...it's 4GB vs 6GB.

I wouldn't recommend a 4GB Nvidia card at this point either...but I've seen too many games already pushing 5GB and beyond on my GTX 980 Ti, and I really don't think that playing with lower texture quality in games is a good thing...texture quality is what makes games look good IMHO...and a 4GB frame buffer will be a limitation in a year or two, that's a given...recent games all use at least 4GB at 1440p if you push the settings a bit...and rendering with higher-quality textures does not impact your performance, it just requires you to have more VRAM to avoid stuttering.

The Fury is indeed faster than the 1060, but its 4GB of video memory will have an impact on games in the future, that's a given.

That said, quite often if you drop texture quality from ULTRA to HIGH it won't change much visually but you save something like 750MB of VRAM.

Nah, considering he said the Fury is almost £70 more expensive than the 1060, it's not worth it at all. Let's abandon that idea - read my post above yours for my updated statement :P

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


10 minutes ago, Pohernori said:

 

1. AMD also has its own "ShadowPlay".

2. Why the hell are y'all still yapping about ShadowPlay like it's the shit? Study the features, please. Use the "LED VISUALISER" instead. That is a feature AMD has zero answer to.

I can name ALL of them, but then you'll say I'm a fanboy:

- Simultaneous Multi-Projection
- VXGI
- Ansel
- Half-precision compute
- CUDA support
- PhysX support
- Conservative rasterization
- Rasterizer ordered views
- Memory compression
- Tiled resources
- VRWorks Audio
- Fast Sync
- Dynamic load balancing and pre-emption

Did I forget any? I probably did.

+ all the stuff that falls into the ''GameWorks'' category, such as MFAA, HBAO+ etc.

 

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


8 minutes ago, Morgan MLGman said:

Nah, considering he said the Fury is almost £70 more expensive than the 1060, it's not worth it at all. Let's abandon that idea - read my post above yours for my updated statement :P

Like you said though...the R9 390X is an old, old chip and the driver optimisation for it is complete...whereas the GTX 1060 is brand new and the driver optimisation for it hasn't even begun...I wouldn't be surprised at all if the 1060 gained +10/15% on average through driver optimisation in a month or two...remember: Nvidia has the money to sponsor titles...they also have the money to optimise the drivers, and this hasn't been done yet...the 390X is mature...it's a grandpa in fact...and the GTX 1060 is a toddler.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Just now, i_build_nanosuits said:

Like you said though...the R9 390X is an old, old chip and the driver optimisation for it is complete...whereas the GTX 1060 is brand new and the driver optimisation for it hasn't even begun...I wouldn't be surprised at all if the 1060 gained +10/15% on average through driver optimisation in a month or two...remember: Nvidia has the money to sponsor titles...they also have the money to optimise the drivers, and this hasn't been done yet...the 390X is mature...it's a grandpa in fact...and the GTX 1060 is a toddler.

Yup, these might or might not be true, which is why I said OP has a tough choice. I'm not sure what I would do; I think I'd look into the used market and look at used 980 Tis.

The GTX 1060 is great, but the lack of SLI kills me a little; considering how little energy it draws, it's a perfect candidate for SLI...

This new KitGuru review of the STRIX RX 480 looks promising when accounting for overclocking results:

 

[Image: 3DMark overclocking results chart from the KitGuru ASUS RX 480 STRIX review]

 

BUT my question is, why is the GTX 1060 so low? Lower than a stock STRIX 480? Looks rather weird.

 

full review link: http://www.kitguru.net/components/graphic-cards/zardon/asus-rx-480-strix-gaming-oc-aura-rgb-8192mb/30/

 

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


2 minutes ago, i_build_nanosuits said:

Like you said though...the R9 390X is an old, old chip and the driver optimisation for it is complete...whereas the GTX 1060 is brand new and the driver optimisation for it hasn't even begun...I wouldn't be surprised at all if the 1060 gained +10/15% on average through driver optimisation in a month or two...remember: Nvidia has the money to sponsor titles...they also have the money to optimise the drivers, and this hasn't been done yet...the 390X is mature...it's a grandpa in fact...and the GTX 1060 is a toddler.

About FreeSync: there is a 2560x1080 widescreen monitor that has FreeSync for the same price as the 1440p one.

My Setup :P

Spoiler

Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 


Just now, Prokart2000 said:

About FreeSync: there is a 2560x1080 widescreen monitor that has FreeSync for the same price as the 1440p one.

FreeSync should be significantly cheaper than G-Sync and should provide a great experience for you; if you pair it with, say, a 390X, you'd be set. Though the RX 480 is also interesting if you're looking to CrossFire in the future, as two 390Xs would draw quite a bit of power and require proper airflow to handle. Eh, decisions, decisions.

There's also the fact that you can CrossFire a 390X with another 390X, an R9 390, an R9 290X or an R9 290.
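For a rough feel of the power side of that, here's a quick sketch; the wattages are assumed ballpark TDP figures, not measured draw, so treat it as illustrative only:

```python
# Very rough PSU headroom check for a CrossFire setup.
# All wattages below are assumed ballpark TDPs, not measurements.

def psu_check(psu_watts, gpu_tdp, gpu_count, cpu_tdp=100, rest=75, margin=0.8):
    load = gpu_tdp * gpu_count + cpu_tdp + rest   # estimated sustained load
    budget = psu_watts * margin                   # keep load <= ~80% of rating
    return load, budget

load, budget = psu_check(psu_watts=650, gpu_tdp=275, gpu_count=2)  # two 390Xs
print(f"~{load} W estimated load vs ~{budget:.0f} W comfortable budget")
# ~725 W estimated load vs ~520 W comfortable budget
```

By the same maths, two RX 480s at roughly 150 W each land around 475 W of estimated load, which a decent 550-650 W unit handles comfortably, while two 390Xs really want a 750 W+ supply.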

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


Just now, Prokart2000 said:

About FreeSync: there is a 2560x1080 widescreen monitor that has FreeSync for the same price as the 1440p one.

Hummm...you'll have to look into ultrawide gaming and reviews to figure out if it's a thing for you or not...

Lack of support and just the general ''weirdness'' of those things turned me away from buying one...personally I love a 16:9 format with higher resolution...you might think otherwise depending on what games you play...but not all games support the 21:9 ratio natively...many will just stretch out to fill the screen (disgusting) or add black bars on the sides, in which case you end up with a 1920x1080 gaming monitor with large bezels :P
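To put a number on the black-bars case, the arithmetic is simple (pure pixel maths, nothing vendor-specific):

```python
# A 16:9 game pillarboxed on a 2560x1080 ultrawide only renders a
# 1920x1080 image, so about a quarter of the panel sits unused.

panel_w, panel_h = 2560, 1080
game_w = panel_h * 16 // 9            # 16:9 width at 1080 px tall -> 1920
used_fraction = (game_w * panel_h) / (panel_w * panel_h)
print(f"Rendered area: {game_w}x{panel_h} ({used_fraction:.0%} of the panel)")
# Rendered area: 1920x1080 (75% of the panel)
```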

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


Just now, i_build_nanosuits said:

Hummm...you'll have to look into ultrawide gaming and reviews to figure out if it's a thing for you or not...

Lack of support and just the general ''weirdness'' of those things turned me away from buying one...personally I love a 16:9 format with higher resolution...you might think otherwise depending on what games you play...but not all games support the 21:9 ratio natively...many will just stretch out to fill the screen (disgusting) or add black bars on the sides, in which case you end up with a 1920x1080 gaming monitor with large bezels :P

I'd also recommend going the 1440p route. Ultrawide is weird. If you can afford a FreeSync/G-Sync monitor, that'd be even better.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 minute ago, i_build_nanosuits said:

Hummm...you'll have to look into ultrawide gaming and reviews to figure out if it's a thing for you or not...

Lack of support and just the general ''weirdness'' of those things turned me away from buying one...personally I love a 16:9 format with higher resolution...you might think otherwise depending on what games you play...but not all games support the 21:9 ratio natively...many will just stretch out to fill the screen (disgusting) or add black bars on the sides, in which case you end up with a 1920x1080 gaming monitor with large bezels :P

It could be good for CS:GO if it works, and I would love a massive 29-inch monitor. Here is a link to it: http://www.lg.com/uk/monitors/lg-29UM68 - can you see if that has good FreeSync?

My Setup :P

Spoiler

Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 


2 minutes ago, Prokart2000 said:

It could be good for CS:GO if it works, and I would love a massive 29-inch monitor. Here is a link to it: http://www.lg.com/uk/monitors/lg-29UM68 - can you see if that has good FreeSync?

Well...you would think it would be...but if you check reviews from experienced/pro FPS players...they can't use it...there is something weird about it that makes your aim very f*cked up...it's not as accurate, and seeing more at once means you'll likely miss important details that would otherwise have caught your attention...pro FPS players, believe it or not, for the most part still use 1920x1080 or even 720p monitors...I think that looks like arse, so I use 2560x1440 myself, and I do play quite a bit of online shooters.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


16 minutes ago, i_build_nanosuits said:

I can name ALL of them, but then you'll say I'm a fanboy:

- Simultaneous Multi-Projection
- VXGI
- Ansel
- Half-precision compute
- CUDA support
- PhysX support
- Conservative rasterization
- Rasterizer ordered views
- Memory compression
- Tiled resources
- VRWorks Audio
- Fast Sync
- Dynamic load balancing and pre-emption

Did I forget any? I probably did.

 

 

Well, if you had continued to act the way you were earlier, I would have definitely called you out again.

A few things:

Conservative rasterization and rasterizer ordered views are DX12 features, as are tiled resources.

You'd need dev support for Ansel, but it is a legit feature.

Compute is for another discussion.

CUDA, OpenCL

PhysX, TressFX, GPUOpen

AMD also has memory compression unfortunately: DCC (delta colour compression).

VRWorks Audio, TrueAudio

lol, Async.

A lil bit more research would've helped you.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Just now, i_build_nanosuits said:

Well...you would think it would be...but if you check reviews from experienced/pro FPS players...they can't use it...there is something weird about it that makes your aim very f*cked up...it's not as accurate, and seeing more at once means you'll likely miss important details that would otherwise have caught your attention...pro FPS players, believe it or not, for the most part still use 1920x1080 or even 720p monitors...I think that looks like arse, so I use 2560x1440 myself, and I do play quite a bit of online shooters.

OK, I will consider it.

My Setup :P

Spoiler

Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 


R9 Fury? It uses HBM.

Core i7 6700k @ 4800mhz 1.33v * 16GB Corsair Vengeance LPX @ 3000mhz * ASUS Z170-A * Corsair H100i * ASUS GTX 1080 STRIX @ 2100mhz * XFX Pro Black Edition 80+ Gold 850w * Phanteks ECLIPSE P400S * AOC U2879VF 28" 4K


1 minute ago, Pohernori said:

 

Well, if you had continued to act the way you were earlier, I would have definitely called you out again.

A few things:

Conservative rasterization and rasterizer ordered views are DX12 features, as are tiled resources.

You'd need dev support for Ansel, but it is a legit feature.

Compute is for another discussion.

CUDA, OpenCL

PhysX, TressFX, GPUOpen

AMD also has memory compression unfortunately: DCC (delta colour compression).

VRWorks Audio, TrueAudio

lol, Async.

A lil bit more research would've helped you.

DX12 features are great...why call me out on that?

Compute is compute...whether it's relevant to the user or not is another story, I agree...but it's there and it's cool.

CUDA and OpenCL --> fun fact, Nvidia supports BOTH...AMD doesn't.

For PhysX...yeah, OK...but there are many others that fall under the GameWorks features, such as MFAA, TXAA and HBAO+ for example.

Memory compression is a win.

VRWorks and TrueAudio? TrueAudio, really? Is that the same? I don't know...ray-traced audio, is it?

Async: refer to conservative rasterization, rasterizer ordered views, tiled resources, hardware memory compression, pre-emption and dynamic load balancing...

Let's not lie though, the greatest and most notable of ALL those cool features are ShadowPlay (yes, AMD has trashy software that can do it, but Nvidia's implementation is much better with nearly zero impact on performance, and it just works great because Nvidia GPUs have a built-in H.264 encoder on the GPU whereas AMD cards do not) and, IMHO, Simultaneous Multi-Projection, which is a game changer for anybody with a triple-monitor setup, ultrawide monitor or VR headset.

All the rest is cool, but not something you should really consider too deeply.

We agree?

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


9 minutes ago, i_build_nanosuits said:

DX12 features are great...why call me out on that?

Compute is compute...whether it's relevant to the user or not is another story, I agree...but it's there and it's cool.

CUDA and OpenCL --> fun fact, Nvidia supports BOTH...AMD doesn't.

For PhysX...yeah, OK...but there are many others that fall under the GameWorks features, such as MFAA, TXAA and HBAO+ for example.

Memory compression is a win.

VRWorks and TrueAudio? TrueAudio, really? Is that the same? I don't know...ray-traced audio, is it?

Async: refer to conservative rasterization, rasterizer ordered views, tiled resources, hardware memory compression, pre-emption and dynamic load balancing...

Let's not lie though, the greatest and most notable of ALL those cool features are ShadowPlay (yes, AMD has trashy software that can do it, but Nvidia's implementation is much better with nearly zero impact on performance, and it just works great because Nvidia GPUs have a built-in H.264 encoder on the GPU whereas AMD cards do not) and, IMHO, Simultaneous Multi-Projection, which is a game changer for anybody with a triple-monitor setup, ultrawide monitor or VR headset.

All the rest is cool, but not something you should really consider too deeply.

We agree?

1060s https://www.overclockers.co.uk/search/index/sSearch/1060/sSort/3

480s https://www.overclockers.co.uk/search?sSearch=rx480

My Setup :P

Spoiler

Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 


1 minute ago, i_build_nanosuits said:

DX12 features are great...why call me out on that?

Compute is compute...whether it's relevant to the user or not is another story, I agree...but it's there and it's cool.

CUDA and OpenCL --> fun fact, Nvidia supports BOTH...AMD doesn't.

For PhysX...yeah, OK...but there are many others that fall under the GameWorks features, such as MFAA, TXAA and HBAO+ for example.

Memory compression is a win.

VRWorks and TrueAudio? TrueAudio, really? Is that the same? I don't know...ray-traced audio, is it?

Async: refer to conservative rasterization, rasterizer ordered views, tiled resources, hardware memory compression, pre-emption and dynamic load balancing...

Let's not lie though, the greatest and most notable of ALL those cool features are ShadowPlay (yes, AMD has trashy software that can do it, but Nvidia's implementation is much better with nearly zero impact on performance, and it just works great because Nvidia GPUs have a built-in H.264 encoder on the GPU whereas AMD cards do not) and, IMHO, Simultaneous Multi-Projection, which is a game changer for anybody with a triple-monitor setup, ultrawide monitor or VR headset.

All the rest is cool, but not something you should really consider too deeply.

We agree?

 

Oh lawd. Again, more research would've greatly helped you.

https://en.wikipedia.org/wiki/Video_Coding_Engine#VCE_3.0

Quote

AMD Video Codec Engine (VCE) is a full hardware implementation of the video codec H.264/MPEG-4 AVC. The ASIC is capable of delivering 1080p at 60 frames/sec. Because its entropy encoding block is also separately accessible Video Codec Engine can be operated in two modes: full-fixed mode and hybrid mode.

[Image: game capture performance benchmark chart]

 

Performance decrease is also very minimal. 
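For context on how that fixed-function encoder gets used in practice, here's a minimal sketch; it assumes an ffmpeg build that includes the NVENC (Nvidia) and AMF/VCE (AMD) H.264 encoders, which depends on your build, OS and drivers:

```python
# Minimal sketch: hand H.264 encoding to the GPU's dedicated encoder
# block via ffmpeg, so the 3D pipeline is barely touched. Assumes an
# ffmpeg build with h264_nvenc (Nvidia) / h264_amf (AMD) available.

import subprocess

def gpu_encode(src: str, dst: str, vendor: str = "nvidia") -> None:
    codec = "h264_nvenc" if vendor == "nvidia" else "h264_amf"
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", codec, "-b:v", "8M", dst],
        check=True,
    )

# gpu_encode("gameplay_raw.mkv", "gameplay.mp4", vendor="amd")  # hypothetical file names
```

Because the encode runs on dedicated silicon (NVENC on Nvidia, VCE on AMD), both ShadowPlay and AMD's capture tools can record with only a small performance hit, as the chart above shows.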

 

More research, less fanboy. Please.

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


8 minutes ago, i_build_nanosuits said:

DX12 features are great...why call me out on that?

Compute is compute...whether it's relevant to the user or not is another story, I agree...but it's there and it's cool.

CUDA and OpenCL --> fun fact, Nvidia supports BOTH...AMD doesn't.

For PhysX...yeah, OK...but there are many others that fall under the GameWorks features, such as MFAA, TXAA and HBAO+ for example.

Memory compression is a win.

VRWorks and TrueAudio? TrueAudio, really? Is that the same? I don't know...ray-traced audio, is it?

Async: refer to conservative rasterization, rasterizer ordered views, tiled resources, hardware memory compression, pre-emption and dynamic load balancing...

Let's not lie though, the greatest and most notable of ALL those cool features are ShadowPlay (yes, AMD has trashy software that can do it, but Nvidia's implementation is much better with nearly zero impact on performance, and it just works great because Nvidia GPUs have a built-in H.264 encoder on the GPU whereas AMD cards do not) and, IMHO, Simultaneous Multi-Projection, which is a game changer for anybody with a triple-monitor setup, ultrawide monitor or VR headset.

All the rest is cool, but not something you should really consider too deeply.

We agree?

As for ShadowPlay, I used it when I had my 970. I also tried Plays.tv/Raptr or whatever it's called (AMD's counterpart), and while it also has little to no performance impact, it's much less intuitive and a pain in the ass to set up properly, which is why I use OBS/XSplit if I need something like that.

ShadowPlay is better because it's easier to set up, and an average user values that a lot. The performance impact is the same, though.

CPU: AMD Ryzen 7 5800X3D GPU: AMD Radeon RX 6900 XT 16GB GDDR6 Motherboard: MSI PRESTIGE X570 CREATION
AIO: Corsair H150i Pro RAM: Corsair Dominator Platinum RGB 32GB 3600MHz DDR4 Case: Lian Li PC-O11 Dynamic PSU: Corsair RM850x White


1 minute ago, Pohernori said:

 

Oh lawd. Again, more research would've greatly helped you.

https://en.wikipedia.org/wiki/Video_Coding_Engine#VCE_3.0

[Image: game capture performance benchmark chart]

 

Performance decrease is also very minimal. 

 

More research, less fanboy. Please.

I know, I was just showing the prices of all the cards instead of typing each one up.

My Setup :P

Spoiler

Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 


2 minutes ago, Prokart2000 said:

do you care about backplates and looks?

what colour scheme it has...only you can check reviews, but with the prices you have available, DEFINITELY a GTX 1060.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


@i_build_nanosuits @ivan134 @Morgan MLGman

 

Let me make something explicitly clear: drivers have fucking sucked for a year. I didn't say a company, just drivers. The reason everyone fights about drivers is that they're both right. This is like politics: it's not about being a good option, just about not being the worst. It's embarrassing. And we let it happen.

If anyone asks you never saw me.


6 minutes ago, Pohernori said:

 

Oh lawd. Again more research would've greatly helped you. 

https://en.wikipedia.org/wiki/Video_Coding_Engine#VCE_3.0

 

Great, so AMD has a video encoder too...good to know...not a reason to be an arse...you learn things every day, and I'm sure you don't know everything either...I assumed they didn't have one on there since they don't have any good software to make use of it, and ShadowPlay has been out for years.

Thumbs up, thanks for helping me learn something today.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


3 minutes ago, i_build_nanosuits said:

do you care about backplates and looks?

what colour scheme it has...only you can check reviews, but with the prices you have available, DEFINITELY a GTX 1060.

I was going to go Founders Edition for the black and green colour scheme (£275).

Is this monitor any good? https://www.overclockers.co.uk/asus-vx24ah-24-2560x1440-ips-widescreen-professional-gaming-led-zero-frame-monitor-black-mo-079-as.html 

My Setup :P

Spoiler

Skylake: I7-6700|MSI B150 GAMING M3|16GB GSKILL RIPJAWS V|R9 280X (WILL BE 1070)|CRUCIAL MX300 + WD BLACK 1TB

 

 

