
R9 380 crossfire, is it worth it?

I have recently added a second R9 380 to my build, so this review will be about whether a Crossfire setup with this card is worth it.

To start off, I know there are heaps of cards that could top this setup, but I couldn't be bothered selling my card, so I just went along with it and got a second one.

Starting with benchmarks, my Crossfire setup almost doubled my single-card score and FPS in a synthetic benchmark. I haven't played any games with this setup yet, and only Crossfire-supported titles would see a benefit from it. The average FPS achieved with a single card was about 40 FPS in the benchmark, but with the Crossfire setup it jumped to 78.5 FPS.

The min and max FPS were:

Single card: min 22, max 78

Crossfire: min 36, max 143
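For anyone curious, here's a rough back-of-the-envelope check of the scaling, using only the average figures quoted above (nothing else is measured here):

```python
# Rough scaling check based only on the average FPS figures quoted above.
single_avg = 40.0      # single R9 380, average FPS in the synthetic benchmark
crossfire_avg = 78.5   # two R9 380s in Crossfire, same benchmark

speedup = crossfire_avg / single_avg           # overall speed-up factor
second_card_gain = (speedup - 1.0) * 100.0     # extra performance the second card added

print(f"Speed-up: {speedup:.2f}x")                     # -> 1.96x
print(f"Second card adds ~{second_card_gain:.0f}%")    # -> ~96%, in this benchmark only
```

So in this synthetic benchmark the second card scales at roughly 96%; actual games that support Crossfire will generally sit below that.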

The total cost of both graphics cards comes to $600 exactly. Now, most people would be shouting at me that I should've gotten a GTX 980 Ti, but PC part prices in Australia are outlandish compared to the United States, with GTX 1080s costing up to $1100 and the cheapest being $800, which is never in stock. So considering all that, my build is still good value in Australia (the same build as mine would've cost a few hundred dollars more if purchased from a different store).

So, after all that stuff, the verdict: is it worth it?

Short answer: it depends. It depends on your individual situation, what you already have, and what you're willing to pay for. If you already have a card but you're willing to sell and upgrade, great, get a 1070; but if you're in my situation then this is probably one of the best ways to go. I get that it mightn't be the best, but at least it gets you a performance increase without selling a thing.

(btw this is my first review so plz don't hate if it's bad)

I suck at typing, prepare for typos.

Desktop

CPU: Ryzen 7 3700x MOBO: MSI X570-A Pro RAM: 32 GB Corsair DDR4

GPUS: Gigabyte GTX 1660ti OC 6G  CASE: Corsair Carbide 100R STORAGE: Samsung Evo 960 500GB, Crucial P1 M.2 NVME 1TB   PSU: Corsair CX550M CPU COOLER: Corsair H100x

 

LAPTOP

Apple Macbook Pro 13 M1 Pro


Nice, I would do that with my 380X, but I'm getting a (used) 390 soon so I might CF that.


✨PC Specs✨

AMD Ryzen 7 3800X | MSI MPG B550 Gaming Plus | 16GB Team T-Force 3400MHz | Zotac GTX 1080 AMP EXTREME

BeQuiet Dark Rock Pro 4 | Samsung 850 EVO 250GB | NZXT 750W | Phanteks Eclipse P400A

Extras: ASUS Zephyrus G14 (2021) | OnePlus 7 Pro | Fully restored Robosapien V2, Omnibot 2000, Omnibot 5402


No, it isn't worth it. I wouldn't bother crossfiring ANY card from AMD right now.

 

And I would only SLI the top-end nVidia card too.

 

Multi-GPU is currently crap: it doesn't work in DX12/Vulkan for the most part, when SLI DOES work the scaling is awful compared to a couple of years ago, and Crossfire both doesn't work in windowed mode and has an annoyingly low compatibility rate with new games (2015 onward).

 

I like multi-GPU but it's in an extremely bad situation right now. If it works great for you, then cool. Otherwise, I'd take a single card if it's decently stronger: 30% or more over one of the cards and I'll take the single card; 25% or less and I'd take multi-GPU.

I have finally moved to a desktop. Also my guides are outdated as hell.

 

THE INFORMATION GUIDES: SLI INFORMATION || vRAM INFORMATION || MOBILE i7 CPU INFORMATION || Maybe more someday


Crossfire/SLI is great for benchmarkers because they don't have to live with the experience of actually playing the game. It's easy to ignore the issues of multiple GPUs while benchmarking until you actually have to play through a full game and live with those issues. I ranted about this the other night. Multi-GPU is nearly 20-year-old technology, dating back to 3dfx right before Nvidia swallowed them. It has gotten worse and worse over the years. It's pretty safe to assume it's a dead technology. I hope I'm wrong, but I don't think I am... not with the current climate.


  • 2 weeks later...
On 13/08/2016 at 3:39 AM, D2ultima said:

No, it isn't worth it. I wouldn't bother crossfiring ANY card from AMD right now.

 

And I would only SLI the top-end nVidia card too.

 

Multi-GPU is currently crap: it doesn't work in DX12/Vulkan for the most part, when SLI DOES work the scaling is awful compared to a couple of years ago, and Crossfire both doesn't work in windowed mode and has an annoyingly low compatibility rate with new games (2015 onward).

 

I like multi-GPU but it's in an extremely bad situation right now. If it works great for you, then cool. Otherwise, I'd take a single card if it's decently stronger: 30% or more over one of the cards and I'll take the single card; 25% or less and I'd take multi-GPU.

How much did nvidia pay you? 

On 13/08/2016 at 7:21 AM, JohnT said:

Crossfire/SLI is great for benchmarkers because they don't have to live with the experience of actually playing the game. It's easy to ignore the issues of multiple GPUs while benchmarking until you actually have to play through a full game and live with those issues. I ranted about this the other night. Multi-GPU is nearly 20-year-old technology, dating back to 3dfx right before Nvidia swallowed them. It has gotten worse and worse over the years. It's pretty safe to assume it's a dead technology. I hope I'm wrong, but I don't think I am... not with the current climate.

Lol

Rig Specs:

AMD Threadripper 5990WX @ 4.8GHz

Asus Zenith III Extreme

Asrock OC Formula 7970XTX Quadfire

G.Skill Ripheartout X OC 7000MHz C28 DDR5 4x16GB

Super Flower Power Leadex 2000W PSUs x2

Harrynowl's 775/771 OC and mod guide: http://linustechtips.com/main/topic/232325-lga775-core2duo-core2quad-overclocking-guide/ http://linustechtips.com/main/topic/365998-mod-lga771-to-lga775-cpu-modification-tutorial/

ProKoN haswell/DC OC guide: http://linustechtips.com/main/topic/41234-intel-haswell-4670k-4770k-overclocking-guide/

 

"desperate for just a bit more money to watercool, the titan x would be thankful" Carter -2016


22 minutes ago, Jumper118 said:

How much did nvidia pay you? 

Nothing. I hate that shitty company. Doesn't mean I'm not going to give the facts as they are. The only reason I would "SLI the top end nVidia card" and "not Crossfire a single AMD card" is because AMD's best card is about the level of a 1070, give or take.

 

And the only reason I would "SLI the top end nVidia card" at all, is because:

- It works in Windowed Mode, unlike Crossfire.

- It can be forced via nVidia Control Panel, which is something AMD's drivers can't do.

- SLI profiles can be tweaked, and specific profiles can be forced and tweaked on a bit-by-bit basis with nVidia Profile Inspector; AMD has no equivalent.

 

If those above reasons did not exist, I wouldn't even consider going multi-GPU at all.

 


6 hours ago, D2ultima said:

Nothing. I hate that shitty company. Doesn't mean I'm not going to give the facts as they are. The only reason I would "SLI the top end nVidia card" and "not Crossfire a single AMD card" is because AMD's best card is about the level of a 1070, give or take.

 

And the only reason I would "SLI the top end nVidia card" at all, is because:

- It works in Windowed Mode, unlike Crossfire.

- It can be forced via nVidia Control Panel, which is something AMD's drivers can't do.

- SLI profiles can be tweaked, and specific profiles can be forced and tweaked on a bit-by-bit basis with nVidia Profile Inspector; AMD has no equivalent.

 

If those above reasons did not exist, I wouldn't even consider going multi-GPU at all.

 

CF scales 100% whereas most of the time SLI scales 60-80% per card. I don't know why you would run anything in windowed mode. You can also Crossfire similar (non-identical) cards, whereas with Nvidia they all have to be exactly the same.


4 minutes ago, Jumper118 said:

CF scales 100% whereas most of the time SLI scales 60-80% per card. I don't know why you would run anything in windowed mode.

Crossfire, in the ridiculously low number of games it works in (especially lately), scales better than SLI. Not going to deny that. SLI, however, should scale around 85-95% where Crossfire scales around 90-101%, so they can easily overlap. And I still can't force Crossfire in any game that doesn't have an official AMD profile... which is the whole point. AND their profile list is smaller than nVidia's list from the get-go.
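To put those two ranges side by side, here's a quick sketch; the 60 FPS baseline is just an assumed example, not a measurement from any game discussed here:

```python
# What the scaling ranges above would mean for an assumed 60 FPS single-card baseline.
base_fps = 60.0  # assumed example baseline, not a measured figure

def two_card_fps(base: float, scaling_percent: float) -> float:
    """FPS with a second card that adds scaling_percent of one card's performance."""
    return base * (1.0 + scaling_percent / 100.0)

for label, low, high in [("SLI (85-95%)", 85, 95), ("Crossfire (90-101%)", 90, 101)]:
    print(f"{label}: {two_card_fps(base_fps, low):.0f}-{two_card_fps(base_fps, high):.0f} FPS")

# SLI (85-95%):        111-117 FPS
# Crossfire (90-101%): 114-121 FPS  -> the ranges overlap, as noted above
```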

 

And if you don't ever run windowed, that's fine for you. But I multi-task too much, and windowed modes are a godsend for people like streamers, who have to alt-tab a lot when some games just do not like alt-tabbing from fullscreen.


16 minutes ago, D2ultima said:

Crossfire, in the ridiculously low number of games it works in (especially lately), scales better than SLI. Not going to deny that. SLI, however, should scale around 85-95% where Crossfire scales around 90-101%, so they can easily overlap. And I still can't force Crossfire in any game that doesn't have an official AMD profile... which is the whole point. AND their profile list is smaller than nVidia's list from the get-go.

 

And if you don't ever run windowed, that's fine for you. But I multi-task too much, and windowed modes are a godsend for people like streamers, who have to alt-tab a lot when some games just do not like alt-tabbing from fullscreen.

I don't stream or play any newly released games. I think Crossfire will work fine for the OP, and it may have been the easier option. Also, two cards look badass even when not running in SLI or CF. I don't have my 970s in my main rig in SLI and the bridge isn't even on.


23 minutes ago, Jumper118 said:

I don't stream or play any newly released games. I think Crossfire will work fine for the OP, and it may have been the easier option. Also, two cards look badass even when not running in SLI or CF. I don't have my 970s in my main rig in SLI and the bridge isn't even on.

Sure. Like I said, that's fine for you. Newer games, however (mid 2014 onward), have been extremely multi-GPU-unfriendly. What works doesn't scale well, and most don't work at all. And again, for AMD, which is reliant on both fullscreen and Crossfire profiles in the driver, it's even more undesirable. Then consider that their strongest card is midrange and their newest card is entry-level gaming in terms of strength? Why would anybody buy two of them now? Like I said, maybe if Vega comes out and we get power anywhere in between a 1080 and a Titan X Pascal, then I'd say there is a card one could Crossfire. But right now? You're simply flat-out better off sticking with one and getting two stronger ones later this year or early next year, or flat-out buying a stronger nVidia card.


2 hours ago, D2ultima said:

Sure. Like I said, that's fine for you. Newer games, however (mid 2014 onward), have been extremely multi-GPU-unfriendly. What works doesn't scale well, and most don't work at all. And again, for AMD, which is reliant on both fullscreen and Crossfire profiles in the driver, it's even more undesirable. Then consider that their strongest card is midrange and their newest card is entry-level gaming in terms of strength? Why would anybody buy two of them now? Like I said, maybe if Vega comes out and we get power anywhere in between a 1080 and a Titan X Pascal, then I'd say there is a card one could Crossfire. But right now? You're simply flat-out better off sticking with one and getting two stronger ones later this year or early next year, or flat-out buying a stronger nVidia card.

I would agree that if you're buying a brand new card, SLI or CF of lower-end cards may not be a good option, but a lot of people CF or SLI with a card that they already have. For example, if I have a single 390 or 970, it's much cheaper to buy another than it is to sell it and buy a new card with the same performance.


3 hours ago, Jumper118 said:

I would agree that if you're buying a brand new card, SLI or CF of lower-end cards may not be a good option, but a lot of people CF or SLI with a card that they already have. For example, if I have a single 390 or 970, it's much cheaper to buy another than it is to sell it and buy a new card with the same performance.

Yeah, except, as I said, the number of games where you'd actually get decent use out of that second card is low.

 

Hence, even if you'd only get 50% extra performance off the bat from a higher-end card within your budget (selling your old card included), I would suggest that over multi-GPU.

 

Because here's the thing: EVERY SINGLE Unreal Engine 4 and Unity Engine 3 through 5 game will never use Crossfire by default. Only about two use SLI via official profiles: Daylight and The Vanishing of Ethan Carter. Most indie titles won't touch them. Many AAA titles don't care enough to bother; once the console version works they're set and PC players can stay salty. And AFR-unfriendly rendering techniques seem to be gaining popularity among those developers as well.

 

I honestly gather that for a noob buying SLI (not using NVPI, or even forcing AFR2 in NCP, which isn't that useful these days), or for anybody, even a cream-of-the-crop enthusiast, buying CrossfireX, the number of games post-mid-2014 that will PROPERLY (shit like THIS HERE from Titanfall 1 does not count as "properly") use multi-GPU is easily under 30%. I could raise that number to over 50% considering NVPI, but that INSTANTLY locks me into nVidia only, and thus I cannot even see AMD as a choice.

 

Do note I've done a good deal of research on this stuff, though. I'm the author of the SLI information guide stickied on these very forums, so I'm coming at this from a feature-set, "big picture" kind of viewpoint. It's definitely true that there are some edge cases that make sense, in every scenario. So, as I said, maybe in your case it's fine. Maybe in the OP's case he happens to play enough games that do support it and few of the ones that don't. But my recommendation still stands. Grabbing multi-GPU as a noob or an AMD user is acknowledging that if you play newer titles, you're seeing your second card used about 30% of the time. Grabbing SLI as an NVPI user means that number climbs quite a bit, but it'll never reach the "if I turned on a game it generally used SLI, or didn't need more than one video card anyway, like Terraria or Beat Hazard or Binding of Isaac" level that I had when I got SLI for the first time in 2013.


2 hours ago, D2ultima said:

Yeah, except, as I said, the number of games where you'd actually get decent use out of that second card is low.

 

Hence, even if you'd only get 50% extra performance off the bat from a higher-end card within your budget (selling your old card included), I would suggest that over multi-GPU.

 

Because here's the thing: EVERY SINGLE Unreal Engine 4 and Unity Engine 3 through 5 game will never use Crossfire by default. Only about two use SLI via official profiles: Daylight and The Vanishing of Ethan Carter. Most indie titles won't touch them. Many AAA titles don't care enough to bother; once the console version works they're set and PC players can stay salty. And AFR-unfriendly rendering techniques seem to be gaining popularity among those developers as well.

 

I honestly gather that for a noob buying SLI (not using NVPI, or even forcing AFR2 in NCP, which isn't that useful these days), or for anybody, even a cream-of-the-crop enthusiast, buying CrossfireX, the number of games post-mid-2014 that will PROPERLY (shit like THIS HERE from Titanfall 1 does not count as "properly") use multi-GPU is easily under 30%. I could raise that number to over 50% considering NVPI, but that INSTANTLY locks me into nVidia only, and thus I cannot even see AMD as a choice.

 

Do note I've done a good deal of research on this stuff, though. I'm the author of the SLI information guide stickied on these very forums, so I'm coming at this from a feature-set, "big picture" kind of viewpoint. It's definitely true that there are some edge cases that make sense, in every scenario. So, as I said, maybe in your case it's fine. Maybe in the OP's case he happens to play enough games that do support it and few of the ones that don't. But my recommendation still stands. Grabbing multi-GPU as a noob or an AMD user is acknowledging that if you play newer titles, you're seeing your second card used about 30% of the time. Grabbing SLI as an NVPI user means that number climbs quite a bit, but it'll never reach the "if I turned on a game it generally used SLI, or didn't need more than one video card anyway, like Terraria or Beat Hazard or Binding of Isaac" level that I had when I got SLI for the first time in 2013.

That's exactly what they want you to do, though. The fewer people buy multi-card setups, the less support they will get and the worse it will get. This means games will end up not being able to utilise the full potential of a monster PC, which makes me sad. It also means people on a budget who just want that extra performance but can't take on the extra expense of a newer-gen card can't do it. It's very anti-consumer.


Just now, Jumper118 said:

That's exactly what they want you to do, though. The fewer people buy multi-card setups, the less support they will get and the worse it will get. This means games will end up not being able to utilise the full potential of a monster PC, which makes me sad. It also means people on a budget who just want that extra performance but can't take on the extra expense of a newer-gen card can't do it. It's very anti-consumer.

I never said it wasn't. But in the same breath I can't tell someone to spend extra money on something I know will be mostly worthless. It's like telling someone to go buy an i7-5820K to primarily play Killing Floor 2, which uses 1 CPU core heavily per GPU used and four CPU cores max, and ignores hyperthreading. Or telling someone to buy a 1500W PSU for an i5-6500 and GTX 980. It's generally wasting money.

 

Besides, there isn't a GPU on the market where you can't get 30%+ better performance from a single card for less than the cost of adding a second card, once you factor in selling the old one. The 1070 comes close, but it still falls into this bracket.
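To put rough numbers on what I mean, here's the kind of comparison I'd do; every price and percentage below is a made-up placeholder, not something measured in this thread, so plug in your own figures:

```python
# Back-of-the-envelope "add a second card vs. sell and upgrade" comparison.
# All figures below are placeholder assumptions.
second_card_price = 300.0   # assumed cost of a second matching card
resale_value = 250.0        # assumed resale value of the current card
upgrade_price = 500.0       # assumed price of a stronger single card

multi_gpu_scaling = 0.90    # ~90% extra when Crossfire/SLI actually works
support_rate = 0.30         # rough share of newer games where it works at all
single_gpu_gain = 0.50      # single-card upgrade ~50% faster, works everywhere

avg_multi_gain = multi_gpu_scaling * support_rate   # averaged across a library of newer games
net_upgrade_cost = upgrade_price - resale_value

print(f"Second card: ~{avg_multi_gain:.0%} average gain for ${second_card_price:.0f}")
print(f"Upgrade:     ~{single_gpu_gain:.0%} gain everywhere for ${net_upgrade_cost:.0f}")
# -> Second card: ~27% average gain for $300
# -> Upgrade:     ~50% gain everywhere for $250
```

On those assumed numbers the single-card upgrade wins comfortably; the second card only looks good if the support rate is much higher for the games you actually play.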

 

Honestly though, the best thing we could do is harass developers for SLI/CrossfireX support, and complain when things don't work properly. Because it's honestly their own fault; it isn't anybody's fault but theirs and/or the publishers'. They chose the engine. They chose what rendering techniques to use. They chose the optimization. They chose to work with or not work with AMD & nVidia to bring multi-GPU support to the game (officially). But even then, I'd still say not to get multi-GPU for almost everyone. I just can't. Especially with the way nVidia has been cutting down on SLI benefits since the Maxwell generation came out.


  • 1 year later...

From the standpoint of today, I think it is good value in general, especially if you get them used. I'm not able to answer from a 2016 standpoint, though, because I'm not sure what the price of that card was in that year (and it has changed since the time of your original post).

Hope this post was helpful,

        @Boomwebsearch 


  • 2 weeks later...
On 28/09/2017 at 6:48 AM, Boomwebsearch said:

From the standpoint of today, I think it is good value in general, especially if you get them used. I'm not able to answer from a 2016 standpoint, though, because I'm not sure what the price of that card was in that year (and it has changed since the time of your original post).

Yes, used, now, it's absolutely amazing value for money, especially in applications that use OpenGL and some slightly older games. However, in 2016 it was bad value, I'll just say that. I could've gotten a beefier card for the same price and had better support for games and fewer issues with screen tearing etc. The cards are still running strong, though, and manage to pull 60+ FPS at 1080p ultra in the games I play. It performed well in the synthetic benchmarks, and an improvement was noticed in games that supported Crossfire. However, the real kicker is that my secondary card only sees usage whilst I'm 3D modelling, content creating, or playing CF-supported games. So if at all possible, I'd still highly recommend grabbing a higher-end single card.

(good for mining tho)

(dont kill me)


  • 2 years later...

I know this is an old thread but I had to chime in. I just ordered a new AMD 5700 XT to replace my 380X Crossfire setup. I have been going through a few games (in 2019, with the 380X Crossfire) to see what my performance gains will be when the new card arrives and I do it all over again with the 5700 XT. I have been testing Gears of War 5, the Resident Evil 1 & 2 remakes, Red Dead Redemption 2, Prey, Dishonored 2 and Warhammer: Vermintide 2. I can say that when you disable Crossfire there is a pretty big drop in performance in all of these games. I'm not going to post numbers, but they range from 7-35 FPS. I tweak all games in the AMD software and can disable Crossfire from there. So yes, adding a second card is a good performance boost even in 2019. I'll come back on here and let you guys know what the difference is between the two 380Xs and the 5700 XT. I'm not expecting that much, honestly. However, there are a few games where I just need more FPS. Setup as follows:

Intel 4790K @ 4.7GHz

16GB DDR3 1866 G.Skill (blue)

MSI Z97 Gaming 5 mobo

1x OCZ Vertex 4 120GB for Windows

2x Samsung EVO 240GB SSD

1x Samsung EVO 500GB SSD

1TB WD Black for storage (in HDD dock)

1x Blu-ray burner

1x Samsung EVO 500GB in hotswap dock

Cooler Master Storm Stryker case, all fans replaced with Noctua fans and 1 added at the bottom of the case.

Cooler Master V8 cooler with the stock fan replaced with a Noctua rad fan.

Sound Blaster Z

Denon 1705 AVR

Polk 5.1 (RM-202 center, RM-101 for the other 4, plus Polk powered sub)

Headphones: Logitech G501, Sennheiser Momentum over-ear, Bose QC2, 1MORE Triple Drivers

TrackIR 4 with a hat I got in Montana on a fly fishing trip.

Saitek X-52 Pro

CH Pro flight pedals

PlayStation 3 controller

BenQ XL2730 27" 1440p 144Hz FreeSync monitor x2

A decent system built in 2014 and upgraded every now and again.

 

 


