
StealthArsenal

Member
  • Content Count

    374
  • Joined

  • Last visited

Awards


This user doesn't have any awards

1 Follower

About StealthArsenal

  • Title
    Junior Member
  • Birthday 1987-01-26

Profile Information

  • Gender
    Male
  • Occupation
    NJ Professional Engineer (Civil)


  1. I have Spotify Premium, but this is no different than a SiriusXM family plan for online streaming. The only difference is they don't care where you listen to it. Now I will say Sirius is pretty trash these days, but if they have not tried this yet, then I don't think Spotify is going to have a good time.
  2. The Pro NVMe is back in the boot order; must have been a seating issue there. The drives still not showing up are my 970 Evo Plus and 1TB WD Black. The NVMe slot not working is M2Q-32G. I took a look in the manual, and I don't see why it's not working. I did try pulling a sound card and disconnecting another SSD to see if that did anything, but it didn't seem to. Now I know for a fact the NVMe drive works, as I just had a fresh Win 10 install on it earlier today for testing purposes. Thanks
  3. Hey guys, I've got a bit of a question, and I guess it took me until now to figure out I had it. We are talking about the system in my signature (7820X and X299 Aorus Ultra Gaming). I noticed that a few of my drives are not being detected, and I wonder if it's because lanes are not available. Right now, my main boot drive is a Samsung 970 EVO NVMe. I have three additional SSDs (all three Samsung) and a 2TB WD Black drive that are being detected. With that said, I have another 1TB WD Black in the system, a Samsung 960 Pro and a Samsung 970 Evo Plus that are not being detected by Disk Management. I know the drives work, so that's not the problem. I moved some SATA cables around and nothing. This has led me to believe that I just don't have the PCIe lanes available for these drives. Am I correct with this assumption, or do I need to enable something? Thanks
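As a rough way to sanity-check the "out of lanes" theory, here is an illustrative lane-budget sketch. The 28-lane figure for the 7820X comes up later in this thread; the device list and x4-per-NVMe assumption are hypothetical examples, not a readout of the actual board, and real X299 boards route some M.2 sockets through the chipset, so the manual's lane-sharing table is the real authority.

```python
# Illustrative PCIe lane-budget sketch (not a diagnostic tool).
# The Core i7-7820X exposes 28 CPU PCIe lanes; each NVMe drive
# typically wants an x4 link. Device list below is a made-up example.
CPU_LANES = 28  # Core i7-7820X

devices = {
    "GPU (x16 slot)": 16,
    "970 EVO (boot)": 4,
    "960 Pro": 4,
    "970 Evo Plus": 4,
    "Sound card (x1)": 1,
}

used = sum(devices.values())
print(f"Requested: {used} of {CPU_LANES} CPU lanes")
if used > CPU_LANES:
    print("Over budget -- some slots or M.2 sockets get disabled or shared")
```

With this hypothetical loadout the request comes to 29 lanes against a 28-lane budget, which is the kind of shortfall that makes a board silently disable an M.2 socket.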
  4. @bignaz I'm going to keep the X299 for now. Going to wait and see what comes down the pipeline.
  5. Alrighty, so I have been doing some testing today and am definitely getting some very interesting results. It's like a catch-22 right now. For clarity, here are my testing rigs:
     - 7820X vs. 3900X
     - Aorus X299 Ultra Gaming vs. Crosshair VIII Hero (X570)
     - 32GB 3000MHz RAM (Vengeance) vs. 32GB 3200MHz RAM (Vengeance)
     - 1080TI FTW3 vs. 1080TI FTW3
     - Clean install Win 10 Pro vs. clean install Win 10 Pro
     I was primarily testing out of the box with stock settings, but I also went ahead and clocked my 7820X up to 4.4GHz on all cores to better match the 3900X, which was turbo boosting and staying consistently at 4.3-4.4. The 3900X also maxed out at 4.4-4.6 on all cores. All these numbers came from CPU-Z and HWMonitor. I did not overclock the 3900X or overclock the mesh on the 7820X. Looking at Cinebench, the 3900X is quite a bit faster multicore in both R15 and R20, and a bit faster single core. In R20, we are looking at 7105 multicore and 503 single core on the 3900X vs. 4219 multicore and 435 single core on the 7820X. R15 on the other hand saw 3077 CPU and 198 single core on the 3900X, while the 7820X was 1774 CPU and 190 single core. Looking at Time Spy Extreme, I got a CPU score of 4495 on the 7820X, while the 3900X pulled 6596 with 3200MHz RAM. Graphics scores were identical, obviously, due to the same graphics card. Fire Strike Ultra had the 3900X physics at 27932 with combined of 6936, while 7820X physics was 21145 with combined of 7071. I also ran Fire Strike Extreme: the 3900X was at 28015 physics and 6928 combined, while the 7820X was at 19174 physics and 5191 combined. I tested just two games because I honestly wasn't expecting too much of a difference:
     - Tomb Raider (7820X): Min FPS = 110, Max FPS = 120, Avg FPS = 119.1
     - Tomb Raider (3900X): Min FPS = 106, Max FPS = 120, Avg FPS = 118
     - Metro Last Light (7820X): Min FPS = 23.49, Max FPS = 175.65, Avg FPS = 97
     - Metro Last Light (3900X): Min FPS = 28.43, Max FPS = 185, Avg FPS = 100
     There is still room, I suspect, to overclock the 7820X more. I could mess with mesh overclocking and bring the multiplier on the cores up more. I am on water, so I have a bit more headroom from a thermal perspective to work with. I also had the 7820X at 1.25V at the 4.4. Flipping to the 3900X for a moment, as I mentioned, it was clocking up on boost to 4.3-4.4 but was doing so at 1.475V. I suspect I can maybe overclock a bit on the 3900X, but it would appear the 7820X might clock further; I clocked it on the fly without a crash, so I think we can go further here. I don't know; I could make the argument to keep the 7820X and return the 3900X, or keep it. I could also see about faster RAM in the X299 system and hang onto it until the 3950X, or find, say, a 9900X and drop it into my X299 board. At this point, I would say it probably is an expensive platform swap for not too much difference; I would probably be better off with a 9900K. I do have a 2080TI coming tomorrow, so either system will do fine with it. Not sure, maybe I am looking at this all wrong.
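To put the CPU-bound numbers from that post in relative terms, here is a small sketch that just does the arithmetic on the scores quoted above (the scores are copied from the post; the percentages are nothing more than ratios of them):

```python
# Relative deltas computed from the scores quoted in the post above
# (3900X score, 7820X score) per benchmark.
scores = {
    "Cinebench R20 multi":  (7105, 4219),
    "Cinebench R20 single": (503, 435),
    "Cinebench R15 multi":  (3077, 1774),
    "Time Spy Extreme CPU": (6596, 4495),
}

for bench, (amd, intel) in scores.items():
    delta = (amd / intel - 1) * 100  # 3900X lead over 7820X, percent
    print(f"{bench}: 3900X ahead by {delta:.1f}%")
```

The spread is telling: roughly a 15% single-core lead versus a 45-75% multicore lead, which lines up with the near-identical game FPS in the same post.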
  6. I am still of the opinion that they knew AMD was coming with the 5700/5700 XT, knew the kind of performance they were going to put out, knew the pricing would be way more attractive, but still went with "we are NVIDIA, we are gods." With two midrange cards, AMD has seemingly knocked NVIDIA on their butts, and it's almost like they don't know what to do other than name cards with dumb nomenclature and increase the price for the above reason. Don't get me wrong, I have been an NVIDIA guy for a long time, but I like competition. If two cards can cause this much ruckus, I'd hate to see what happens to NVIDIA when, let's say, a 5800 or 5900 comes out.
  7. Agree 100% with @driftz240. I have not heard or seen a driver issue that is intermittent like that. Definitely sounds like a card issue. I would DDU the driver, pull the card and reseat it, or put it into another PCIe slot and see if the issue persists.
  8. @Princess Luna money really isn't a factor to be honest. I just have principles, and the FTW is more than I am willing to spend in comparison to other cards on the market. However, if this was my Corvette or 911 GTS, you bet your life I would drop whatever I need to, to make them faster. Monitor-wise, I do want to max out my current 3440 x 1440 screen, at least the best I can. I don't have much more overclocking room on my 1080TI. I have considered waiting, but something tells me the next generation will be even more expensive, which is something I am not too keen on, especially given that when I got my current TI, it cost me $600 straight from EVGA, so there is that. I can sit here and talk myself into the 2080TI all day long, but actually going to Microcenter and bringing one home might be more difficult, as I would have to drive on over. That would seem to indicate that I am not all in on it yet. The other side of the coin here is that I am big on selling off hardware before it becomes a dime a dozen. For example, I am a huge mountain biker and downhill racer. I keep my bikes a season, two at the most. I want to be able to take my $10,000 race bikes and get at least $5-6K for them; after two seasons, that goes to like $2K. I constantly turn over my gear. This just happens to be the first system I did a set-it-and-forget-it on when I built it. @Firewrath9 I was looking at that card. Was kind of up in the air (no pun intended) on whether I wanted to go back to a water-cooled graphics card. My 980TIs and my Founders 1080TI (when I had it) were under water with EK blocks. With the FTW 1080TI, I went back to air cooling. I don't really have a reason, just like the aesthetics I suppose. So you think the Strix is a better card than the Aorus. That is interesting, not that I am saying you are wrong. I don't know enough about the other manufacturers to pass judgement or say A is better than B is better than C, if that makes sense.
I have to take a ride to Microcenter later to get an NVMe drive. I want to bench X299 against my X570 on a new NVMe drive with nothing on it and see what I have, rather than screw up all my drives. Maybe I will grab a 2080TI to test with and see what I get over the 1080TI. This is why I love Microcenter.
  9. Hello all, Seems of late I have gotten the bug to play around and build a new system. I decided to go out and grab a 3900X and pit it against my 7820X and see what happens. Aside from that, I am considering an upgrade from my 1080TI FTW3 to a 2080TI. I have been on EVGA FTW or Classified cards for years. Currently, most everyone knows the 2080TI FTW3 Ultra is $1400-1499. This got me looking at a few other models. Strix 2080TI https://www.microcenter.com/product/512876/rog-strix-geforce-rtx-2080-ti-overclocked-triple-fan-11gb-gddr6-pcie-video-card Aorus Xtreme 2080TI https://www.microcenter.com/product/601923/aorus-xtreme-geforce-rtx-2080-ti-triple-fan-11gb-gddr6-pcie-video-card Aorus Gaming https://www.microcenter.com/product/511350/gaming-geforce-rtx-2080-ti-overclocked-triple-fan-11gb-gddr6-pcie-video-card Zotac AMP https://www.microcenter.com/product/511353/amp-geforce-rtx-2080-ti-triple-fan-11gb-gddr6-pcie-video-card Zotac I have never used. I have used some Strix cards, but they were 970 or 980 Strix. The Aorus/Gigabyte cards I have never used. From a warranty standpoint, I know EVGA is unmatched, but how are the others if a card fails? Is one of these cards significantly better than the others? When I use the term better, I am talking about build quality and stability. I understand I could clock some cards higher, but I particularly like models with triple-fan solutions. I guess you can say my question stems from just not branching beyond EVGA in the last 10 years or so. As I mentioned, with the pricing of the FTW, I might look elsewhere as long as the quality is good. I will say this: I am looking to push my games on ultra with the 3440 x 1440 monitor in my signature. The 1080TI is no slouch, I am just looking for a bit more performance and the ability to run another 34" curved and add in a 4K panel in the near future. Thanks for any help.
  10. Whoa, didn't have the opportunity to check this during the day today. Definitely no 3-way SLI, but I can go 2 way SLI as I do have another 1080TI it just isn't in the system right now. Now one thing I do have, but have not put in yet, is a third NVME drive. I may retire my Pro drive, but that is something I have not decided upon as of yet. I am half tempted to grab a processor and mobo from Microcenter and put it head to head with my current system and see what I find out.
  11. @_Syn_ Just wanted to clarify one other thing. I was at work when I originally posted this thread. Where I was going with the PCI Express lanes was actually the 24 on the CPU, but the chipset has another 16 usable (20 total), for 36 usable / 44 total.
  12. @_Syn_ You are absolutely right. I wasn't jumping on it right this second. I have a Microcenter local, so I can go pick everything up whenever. I guess I'm seeing X299 as pretty much dead now, aside from two outliers. Thanks
  13. @_Syn_ If I load up Civil 3D and, say, externally reference in topography and other drawings, then draw an alignment and start using pressure networks and create a profile, I will start utilizing all the cores. Same goes for vertical and horizontal curves and profiles. Both Revit and CAD are resource monsters. I started at work with a 4-core, 8-thread machine, then went to the 8950HK, and both programs function 100 times better. Now I bring my files over to my desktop and draft, because that's smoother and faster yet. I get the consensus behind having the platform already. I'm not 100% saying I'm switching; I'm just thinking that maybe from an upgrade standpoint, Gen 3 is worth it. I don't really know. Would love to see some additional benchmarks, but what I did see on OC3D a bit ago is X299 getting smoked (aside from two XE models). I don't mind purchasing the AMD system, but then again, maybe a 2080TI might be nice, or another Alienware 34". I don't know. I'll see what I can find in additional benchmarks and make an informed decision. Thinking of possibly selling off my X299 if I do switch. I know what I was thinking, but I had to be realistic and say to myself, there is no way you will use the second system. Thanks
  14. @_Syn_ Thanks for responding. I'm not sure I agree with the single core usage of autocad/revit regardless of what they claim. I am maxing out 8 cores on my 7820X and all six cores on my 8950HK in my work laptop. That's a debate for another time. As far as gaming benchmarks and such, I am significantly behind the 9900K even overclocked on my 7820X. I did see some of what you were referring to, which has me a bit confused I suppose. I was thinking something completely different with lanes, so I apologize about that. For some reason I was thinking 40's on the 3900X. I knew the 7820X was 28, so I should've looked that up before I said anything. I completely agree with the quad channel comment. In my head I was thinking this, but I wanted a confirmation. If I bought the 3900X, I have another system. That was kind of the idea, but I did not mention this. I would move my 7820X into projector gaming status or upgrade a family member who renders out production on a 3700K. I guess the other way to look at it is X299 is going to be a dead socket soon? I dunno. I don't have to make this move, I was just contemplating it. It just seems the AMD systems now are just better and I really want AMD to be good again.
  15. Hello everyone, It's been a couple of years now since I fully rebuilt my system. With that said, I have kept tabs on all the latest and greatest, but just hadn't really wanted to upgrade to anything yet, at least not without seeing AMD's new offering. So with that said, I am considering the switch from my 7820X over to the 3900X. Obviously, some of the benchmarks I have seen in the many, many YouTube videos yesterday (there aren't a ton with the 7820X in there) show the 7820X pretty significantly below anything current. This of course I expected. What I am looking to do is get a bit better gaming performance, but also better multi-threaded performance when I am drafting in AutoCAD and/or Revit. Obviously for me, I can sacrifice some single-threaded performance versus, say, the 9900K (but am I really?). Am I losing much of anything from dropping from quad channel back to dual channel? Obviously, I am gaining 4 cores, significantly more threads, cache and PCIe lanes. Logic and everything I have read would tell me that the 3900X is better in every way than the 7820X and the X299 platform. From a cost perspective, it doesn't seem too logical to grab a higher-end X299 CPU. With that out of the way, I was thinking along the lines of the 3900X paired with an Aorus Master or ASUS Crosshair with 32GB of 3600MHz Trident Z memory. The rest of the system in my signature would remain intact; this would simply be a CPU, board and RAM swap. I guess what I am specifically asking is, is there enough of a performance increase to justify it? Thanks Chris