Everything posted by Jrasero

  1. Yeah, for the parts I want it's like $1,300 before tax, so if I am not missing out on much with the 7800X3D, sticking with my 5800X3D and AM4 sounds solid since I game at 4K.
  2. That's where I am leaning. Are there still memory issues and chips burning up on AM5?
  3. Currently planning to swap cases when the NCase M1 Evo ships, but it got me wondering if I should "upgrade" from an AM4 5800X3D to an AM5 7800X3D. I game at 4K w/ an RTX 4090 and mostly play sports games.
  4. I think the "why not" argument is somewhat stupid since, in theory, why not just get an RTX 4090 every time? Yeah, if you are considering an RX 7900 XT, "just" pay $100 more for the RX 7900 XTX, which is impossible to find; and if you can get a 7900 XTX, why not pay $200 more and get the RTX 4080; and if you can afford that, why not just get the RTX 4090 for $400+ more? At a certain point, just get whatever is available, is in budget, and makes sense for your needs. Yes, this gen is complete shit value-wise, but that's not too much of a surprise. If I hadn't gotten an AIB RTX 4090 at launch I probably would have just gone with an RTX 4080 FE; yeah, value-wise it sucks, but it's still a very good 4K card that at bare minimum would still get 60 FPS in the most demanding games like Cyberpunk 2077. The one area where the 4090 dominates everything is ray tracing, but you have to ask yourself: does that matter to you?
  5. Case swap for 2023. SFF Time N-ATX V2 https://pcpartpicker.com/b/MNp2FT
  6. Thanks for the replies, all. To reiterate, this move is 100% for gaming. The most I use this computer for otherwise is Office and web browsing, so I understand that the 5900X would be better at productivity, but I really only game or do things a Chromebook could accomplish.
  7. I am currently running a B550i mobo w/ a 5900X, DDR4-3600 RAM, and an RTX 4090, and I play at 4K. I only use the computer for productivity and mostly gaming. Would going to a 5800X3D be an upgrade, a lateral move, or a downgrade? I can get a 5800X3D shipped for $350 and sell my 5900X for at least $250, so the swap could cost as little as $100. The reason I am thinking of doing this is to breathe some life into my AM4 setup and push out a total mobo/CPU/RAM upgrade another 1-2 years.
  8. With the NVIDIA adapter I unplugged it twice and examined the connectors, with no sign of melting. I am holding off on unplugging the MODDIY adapter to check for any melting since I don't want to constantly be plugging and unplugging the 12VHPWR connector. I am going to give it a few more hours of gameplay and check back then.
     Cablemod would have been my first choice, but they don't make a universal adapter, they don't make a 12VHPWR cable for Silverstone yet, and their 180 adapter was delayed. With that said, Cablemod's cables, while probably 100% fine, are also brand new to the market, and people are only now getting their 12VHPWR cables. I don't understand why people question MODDIY's quality but just assume CM's will work; it's very possible both of these companies offer a better solution than the included NVIDIA adapter.
     At this point everyone is guessing what is causing the melting pins and no one really knows why, but IMO MODDIY's adapter in theory addresses all the possible problems if NVIDIA's adapter truly is the issue. NVIDIA's adapter is uni-sleeved, and if you bend that type of cable, which is so stiff and hard to bend, the top row of wires receives huge pressure and creates bad contact, or a pin may even be pulled out. MODDIY's cables are ultra flexible and individually sleeved, so there is a lot less pressure on the pins, plus MODDIY uses a single split seam.
  9. I got my Gigabyte OC RTX 4090 a little over a week ago and put it in an ASUS Prime AP201 mATX case w/ a Silverstone SX1000 PSU using the NVIDIA adapter. As you can see, the adapter was bent to the side. I only used three of the 8-pin PCIe cables and never overclocked the GPU. I gamed maybe a total of 4 hours on the card. The cable did slightly hit the side panel. I did check the pins twice looking for any melting, but as you can see, nothing melted.
     [Image: Gigabyte OC RTX 4090 w/ NVIDIA Adapter]
     With this said, I might have gotten one of the "good" adapters or just got lucky with my bend. Either way, I didn't want to chance it, so I bought a 12VHPWR adapter from Amazon (SIRLYR), a 16" triple 8-pin to 12VHPWR 16-pin power cable made specifically for the Silverstone PSU, and a 4" triple 8-pin to 12VHPWR 16-pin adapter cable from MODDIY. I decided to go with the MODDIY adapter since then I wouldn't have to take out my GPU and redo all my cables, as the case is a single-chamber one. I got the ultra-soft silicone wire cables, and man, are these things flexible while still not feeling cheap. Since these cables are very flexible and individually sleeved, I was able to bend the wires more without the fear of pulling out the pins, so I am able to seat the side panel with little to no tension on the connector. MODDIY also uses a single split seam.
     [Image: Gigabyte OC RTX 4090 w/ MODDIY Adapter]
     I did a little bit of gaming with zero issues. Time will tell if this adapter holds up or if the NVIDIA adapters were the problem. I ordered these items from MODDIY on 10-26, paid $39 for express FedEx shipping, and got them today, 11-1.
  10. I agree with the forward thinking, but I think in the end there clearly is a problem. Hey, as of now a few RTX 4090s have this problem, and the best-case scenario is that we find out it's because of user error and/or a defective adapter, but the worst-case scenario is that 12VHPWR simply isn't ready for prime time and presents way too many problems. I think where this gets tricky is when the RTX 4080 and RTX 4070 launch and whether this stuff is still occurring.
  11. Just finished up my RTX 4090 build. Pardon the poor cable management, which I've got to work on.
  12. Very common. I have had three. I got lucky and scored my 1st RTX 3090 FE through BB, but it had horrible coil whine, and I returned it. Then I got lucky again and got another RTX 3090 FE with probably equal coil whine and eventually returned that too. I got lucky a third time and got another RTX 3090 FE, again from BB, and this time the coil whine was less but still very present. Eventually, after 6+ months, the coil whine died down.
  13. I got lucky. I forgot the launch was on 10-12 and was 10 minutes late. I kept refreshing Newegg and Best Buy and kept having all the cards go out of stock. Finally the Gigabyte OC one came into stock and I grabbed one without even thinking about payment. After a minute or two I realized I'd rather use a different payment method, so I made a whole other order, and only after my 2nd order was confirmed did I cancel my 1st order. The card stayed in stock for maybe 10-15 minutes.
      I live by Microcenter, and it seemed that the cards lasted 1-2 days, but everyone who was there at launch got a card. It did seem that most people got the Gigabyte OC or some form of the ASUS TUF or ROG. I personally would have loved an FE, but it seems like those are having the most problems. Knock on wood, my RTX 4090 seems fine: no wobbling fans, stable temps, no coil whine, and great performance.
  14. C2 42" hands down. I have had Alienware's IPS 34" AW and while they make a very good monitor they are inherently very gamer centric in terms of aesthetics and secondly they are only a monitor. While for some 34" is huge or big enough, I found 34" in terms of multitasking to be a tad limiting for how many screens I have open. I was using a C1 48" as my monitor and that handled all my tabs and screens perfectly but a lot of the content fell outside my sightline. The 42" is perfect IMO and I can see every inch of the TV/monitor with turning. Even if I hated the monitor I simple could repurpose it and use it as a TV in our spare room, but with the Alienware it's simply a monitor.
  15. Overpriced compared to the older outgoing C1, maybe, since it's smaller and performance-wise there is no difference, but not really overpriced compared to something like the AW3423DW.
  16. I have both and have tried both as monitors, and hands down the 42" is better suited as a monitor due to its size; also, if you plan on using the stock stands, the stand on the C2 42" is much lighter and half the depth. I just found a 48" screen was a tad too big. It was a couple inches too tall and too wide, to the point that my sightline from my desk (3') couldn't pick up certain parts of the TV.
  17. I would say neither, IMO, since I think 48" is too big for a monitor. I had a C1 48" as my main monitor, and while I thought it was great, it simply was too big, so I moved it to our bedroom and just bought the C2 42" for my desk. If you are fine with 48", yeah, the C1 48" at $999 is a deal, but I would say you need some sort of huge desk that is preferably height adjustable, and you should be wall mounting or buying some kind of stand; at that point that might add so much money to the C1 that just buying a 48GQ900 makes more sense.
  18. Performance-wise, there are a couple of sites that show brightness on the C1 and C2 is the same. The only difference the C2 has is the newer chip, so the UI is much snappier. I had a C1 48" as my monitor but just moved it to our bedroom and replaced it with a C2 42", and besides the size, stand, and UI I can't tell the difference. As for a sound bar, I think the speakers on the C1 are pretty good since they bounce off the huge metal stand. I do, however, have a Sony HT-X8500 sound bar for our Sony AJ80 65" in our living room. For $300 the Sony HT-X8500 is the only SB with eARC and Dolby Atmos plus a built-in subwoofer. Granted, it is a $300 SB, so remember you kind of get what you pay for: bass, soundstage, and clarity aren't amazing, but it's still a clear upgrade over the TV's speakers.
  19. I think it's worth it, but with some caveats. One being: is your GF the type that keeps a card until it breaks or can't run what she wants? If so, then buy now. Prices and stock will eventually get better, but we are getting closer and closer to a Q4 next-gen release. If you don't want to wait, or don't want to take your chances trying to obtain one, then again buying now makes sense. If you are looking at UW, then yeah, the RTX 3070, RTX 3070 Ti, RTX 3080, and so on are all great, but at least in America the Radeon 6800 and Radeon 6800 XT are more readily available and cheaper. I run a 6800 XT in my secondary system on a 3440x1440 UW, and besides odd games like Cyberpunk 2077 w/ RT, everything runs great on it.
      Personally, you could wait, but how many weeks or months are you willing to wait until prices "normalize"? I am not saying €1,300 for an RTX 3080 is right, just that there is an opportunity cost in not being able to play in the meantime.
  20. A car for most will 100% be more of a priority. Good luck, though, finding a new or used car; that process isn't as bad as getting a GPU, but it's pretty hard. I live in the NY tri-state area, and maybe only a handful of dealers weren't marking up cars above MSRP. Only a few had any kind of stock that wasn't "due in" or had to be preordered. Buying a used car, while cheaper than a new car, is one of the worst values since they are practically going for what new ones do, but with thousands of miles on them. Also, leasing, while not dead, has zero incentives.
  21. I mean, it depends on availability, IMO. For me in the US (NY tri-state area), I have four Micro Centers around me, and I can find a Radeon 6800 XT for $1,200-$1,300 pretty easily and a Radeon 6900 XT for $1,500-$1,600 very easily. In terms of RTX cards, sometimes an RTX 3070 Ti pops up, but the RTX 3080 and RTX 3090 are rarely in stock. The one card that also occasionally pops up is the RTX 3080 Ti, but that goes for $1,800+. If you play at 1440p, I think the Radeon 6800 XT is a good choice, especially if you can find one for $1,200 or below. The lack of some form of supersampling and the inferior ray tracing are very noticeable in games like Cyberpunk 2077, but if you don't need DLSS, and/or are willing to wait for AMD's version, and don't care to tank your FPS with ray tracing, then the 6800 XT can work for you. At the end of the day we are slowly creeping toward next-gen cards, so if I were in your position I would just keep using your current card. That way, when RTX 4000 or Radeon 7000 drops, you can get whatever you want, or you can buy whatever card, even scalped.
  22. I used to have an EVGA RTX 3080 XC3 Ultra but now have an ASUS TUF Radeon 6800 XT, and if prices were the same, heck, even normalized, I would say the RTX 3080. If upscaling and ray tracing matter to you, then the RTX 3080 is a no-brainer. Also, Nvidia's encoder and software suite are nice bonuses. Really, the Radeon 6800 XT is a nice plan B card. Yes, it's a slightly better 1440p card but worse at 4K. Unless you can make use of the 16GB of memory, or can find a Radeon 6800 XT for substantially cheaper than an RTX 3080, the RTX 3080 is the clear 1A option. With this said, I don't dislike my Radeon 6800 XT, but it's like getting Burger King when you wanted McDonald's.
  23. Just picked up an open-box PowerColor RX 5800 XT Red Dragon for 15% off from Microcenter. I am still waiting for my CPU and mobo to be delivered, so I haven't tested the GPU yet, but as soon as I got it out of the box it had this almost body-odor smell. It doesn't really smell "burnt" per se, but again more of a BO/cat odor. I peeled off the Microcenter labels from the box, and the 1st label had a sticker from 11-8-21 marking it RTV for "Overheating." A 2nd sticker showed a processed return saying it was tested on 11-28-21. When I bought this in-store, the salesperson did mention that the item had already been sent back to PowerColor. I did pass on the extended warranty.
      This card has a massive heatsink and 90% metal construction, so this wouldn't be burnt plastic, but something on the PCB? My receipt does say that the limited 2-year manufacturer warranty is still there. Any chance this card still works and I won't have to return it?