
BOOM BOX

Member
  • Posts: 24
  • Joined
  • Last visited

Everything posted by BOOM BOX

  1. I would read this, but it would cost me $100, $200, $300, maybe even $500 worth of my time. Sorry, but I don't have time to read this half-assed clapback. Like, why bother responding if you're not going to actually produce something useful to stand on?
  2. I was actually looking at Switchcraft's M&F connectors on Sweetwater, but of course no one puts up reviews to help discern quality from trash. I'll keep Neutrik in mind as I go along, though. Thank you. I'm sure in about a month or so you'll see the final product somewhere on here or Reddit.
  3. Well, obviously shielded is the way to go when talking about audio cabling, but I didn't even bother to look at Mogami until you brought them up. Thank you, I'll look into it. Edit: after some consideration, and reminding myself that I'm allergic to money, Imma go for some of Mogami's wire. Thanks, my dude.
  4. Long and short of it is that I'm working on a little project that's going to require me to solder some XLR connections together. The question is simple: do you know of any shielded cable that's worth a damn and that I can get in spools?
  5. Remember, kids: a 52 V power line right next to the 1.7 V data line. Flawless Apple engineering.
  6. Thank you! Now, if you'll excuse me I have to go chuck more money at the deep dark void that is my PC.
  7. Just to make sure we're on the same page, dual rank memory @ 3200 MHz means dick all to 3rd gen Ryzen. That's what I'm gathering from your reply.
  8. I actually didn't know the memory controllers weren't the greatest. That would explain a lot about why I was having so many issues. It didn't quite answer my question, but I do appreciate that little tidbit. So allow me to clarify the question a bit more; I wish I had the booklet that my 2600X came with so I could just show you guys. On the 2000 series there was a difference in how high you could push your RAM's clock rate depending on whether it was single rank (memory chips on one side of the RAM PCB) or dual rank (memory chips on both sides of the RAM PCB). The question I'm trying to answer is this: "Does the performance difference between single rank memory and dual rank memory still exist on 3rd gen Ryzen?" I apologize if my original question was confusing; hopefully rephrasing it more directly pinpoints the answer.
  9. So for context, I used to run an R5 2600X with Corsair Vengeance LPX 1 x 16 GB @ 3200 MHz (underclocked to 2133 MHz or something like that), and I could never get the RAM to clock any higher without running into POSTing issues because it was dual rank. With that bit of context, my question is this, especially for those who, like myself, got an R9 3950X: if I were, in theory, to get a set of dual rank memory, would I run into any issues trying to get XMP to work or manually "OC" the RAM (i.e. taking dual rank memory from its factory clock of 2133 to 3200 using XMP without a single issue)? Any input would be super useful; there's some rough bandwidth math sketched after this post list. I personally know what I want to do, I just want to know if dual rank memory is worth it for 3rd gen. Thank you in advance.
  10. Firstly: a fine gentleman in the guru3d comments made a valid point, and I quote, "The 3080Ti having almost twice the cores than the 3080 seems a little fishy..." I have a hard time justifying any comment that denies that statement. The only counterpoint would be that the SKU name is misleading and Nvidia is considering reverting to the original release format they used with the 10xx series and prior. But that would be dumb, as it's been proven that releasing the Titan before the Ti only cannibalizes Titan sales, not that they really sell a lot of them in the first place. Secondly: the Geekbench results throw a massive red flag when you look at the CUDA information. Someone correct me if I'm wrong, but a 1.11 GHz max frequency on a 30xx compared to a 1.545 GHz boost clock (1.35 GHz base) on a 2080 Ti seems like a bit of a step backwards. I would argue that just because you theoretically have near double the CUDA cores doesn't mean you should back off on clocks; if anything, keep the clocks the same or give them a marginal bump (there are rough throughput numbers sketched after this post list). If I were to spitball here, which I am, I'd spitball that the info in this spreadsheet could be pointing to a workstation card or a server card. Like I said, I'm spitballing based on what my gut is telling me; I'm sure someone with more knowledge or experience around this stuff would be willing to fill in the gaps or correct me on any of this. Not trying to be rude, I'm just laying out the things I've found that could be red flags.
  11. Damn, I wish my NZXT H400 had the space for a second one AND all of the watercooling I'm about to do to the poor thing.
  12. Not trying to flex on everyone, just want to make that clear. These were genuinely my most recent purchases.
  13. Banned for being a literal buzz saw in space.
  14. I had a feeling that might have been it, but God damn, talk about weird. Anywho, have fun.
  15. Banned for having a wraith spire cooler.
  16. Banned for being a part of the ceiling gang.
  17. Okay, I would say wait until your PSU arrives, and if you're having the same issue, come back to this post, I guess. Out of sheer curiosity, what PSU are you getting?
  18. What sort of codes is your board throwing? Is it throwing any at all?
  19. Yeah, I'd say those results are pretty standard for a stock system. I ended up pulling something similar in Time Spy when I had my 2600X and 2070 Super.
  20. I'm gonna err on the side of no. Nvidia suggests a 650 W PSU if you're gonna run a 2070 Super, and having pulled the TDPs of the CPU and GPU from the manufacturers' pages, you're sitting at 280 watts on just those two items alone. Now, let me remind you that those TDPs are more than likely an average of idle and load power draw, so under load you're probably going to be somewhere in the neighborhood of 350-450 W (there's a rough wattage estimate sketched after this post list).
  21. I have an easy three-step process for how you should handle this: Step 1: get a non-Founders Super. Step 2: overclock the non-Founders Super. Step 3: profit. Or you could just wait until the RTX 30xx release and upgrade to a 30xx GPU, if it's not price gouged like the 20xx series was.
  22. Allow me to put it as simply as possible: all the parts in my build are stock clocked. I haven't even attempted to turn on XMP or manually overclock the RAM. I don't want to do that until I have the build on a water loop.
  23. Fire Strike Extreme
      CPU: AMD Ryzen 3950X @ 3500 MHz | 4376 MHz (BC | TC)
      GPU: Nvidia RTX 2080 Ti
      GPU Core: 1980 MHz
      GPU Memory: 1750 MHz
      Score: 16527
      http://www.3dmark.com/fs/22413563
      https://pcpartpicker.com/user/666BOOMBOX666/saved/wKRNQ7
      Note: the water cooling components, storage, and PSU shown in my PCPartPicker list are not in my build yet. I will update my FSE score when the PC is 90% complete (i.e. watercooling and PSU are installed and everything has been overclocked).
  24. I'm genuinely not trying to come off as a prick when I say this, but my question is this: why are you making a game engine? What are you attempting to achieve with this that you can't achieve with any other engine? Again, I'm honestly curious and am not trying to come off as a prick.
  25. No, not really. I had a stranger walk up to me and start talking to me as if we were friends. I ended up asking the dude if he was the president, then told him to shut the fuck up when he answered no.
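
A rough back-of-the-envelope for the dual rank memory questions above, showing what's at stake when RAM rated for 3200 ends up stuck at 2133. This is a minimal sketch assuming a standard DDR4 channel (64-bit, i.e. 8 bytes per transfer); note the 1 x 16 GB kit mentioned above would run single channel, so the single-channel line is the relevant one there. It only covers raw peak bandwidth, not the separate rank-interleaving behaviour.

```python
# Peak theoretical DDR4 bandwidth: transfers/s * bytes per transfer * channels.
# Assumes a standard 64-bit (8-byte) DDR4 channel; real-world throughput is
# lower, the point is just the relative gap between 2133 and 3200.

def ddr4_peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    """Peak bandwidth in GB/s for a given DDR4 transfer rate (in MT/s)."""
    bytes_per_transfer = 8  # 64-bit channel
    return transfer_rate_mts * 1_000_000 * bytes_per_transfer * channels / 1e9

for rate in (2133, 3200):
    print(f"{rate} MT/s, dual channel:   ~{ddr4_peak_bandwidth_gbs(rate):.1f} GB/s peak")
    print(f"{rate} MT/s, single channel: ~{ddr4_peak_bandwidth_gbs(rate, channels=1):.1f} GB/s peak")
```

On those numbers, 2133 vs 3200 is roughly 34 GB/s vs 51 GB/s of peak dual-channel bandwidth, which is why getting XMP to actually hold matters.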
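
To put rough numbers on the cores-versus-clocks tradeoff weighed in the 3080 Ti leak post above, here is a minimal sketch of theoretical FP32 throughput (cores x 2 FLOPs per clock for an FMA x clock). The 2080 Ti figures (4352 CUDA cores, 1.545 GHz reference boost) are real; the other card is a pure assumption for illustration, roughly double the 2080 Ti's cores at the 1.11 GHz reported in the Geekbench entry, not a claim about any actual product.

```python
# Theoretical FP32 throughput: cores * 2 FLOPs per clock (one FMA) * clock.
# Peak math throughput only; says nothing about memory bandwidth,
# architecture, or real game performance.

def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    """Theoretical single-precision TFLOPS, assuming one FMA (2 FLOPs) per core per clock."""
    return cuda_cores * 2 * clock_ghz / 1000

print(f"RTX 2080 Ti, 4352 cores @ 1.545 GHz boost: ~{fp32_tflops(4352, 1.545):.1f} TFLOPS")
# Assumed card for illustration: ~2x the 2080 Ti's cores at the leaked 1.11 GHz clock.
print(f"Hypothetical ~2x-core card @ 1.11 GHz:     ~{fp32_tflops(2 * 4352, 1.11):.1f} TFLOPS")
```

Even at the lower clock, near-double the cores still comes out ahead on paper, so the low clock on its own could also just reflect a pre-production or workstation-tuned part, as the post suggests.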
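
A rough sizing sketch for the 2070 Super PSU post above. The 280 W combined CPU+GPU TDP figure comes from that post; the 100 W rest-of-system allowance and the 1.5x headroom factor are assumptions picked for illustration, not measured values.

```python
# Very rough PSU sizing: component TDPs + an allowance for the rest of the
# system (board, RAM, drives, fans), then headroom for transient spikes and
# to keep the PSU near its efficiency sweet spot.

def recommend_psu_watts(cpu_gpu_tdp_w: float,
                        system_overhead_w: float = 100,  # assumed allowance
                        headroom: float = 1.5):          # assumed safety factor
    """Return (estimated load, suggested PSU wattage) from combined TDPs."""
    estimated_load = cpu_gpu_tdp_w + system_overhead_w
    return estimated_load, estimated_load * headroom

load, psu = recommend_psu_watts(280)
print(f"Estimated load: ~{load:.0f} W, suggested PSU: ~{psu:.0f} W")
```

That lands around a 380 W estimated load and a ~570 W supply, which is in the same ballpark as Nvidia's 650 W recommendation for a 2070 Super system.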