
JMSOG

Member
  • Posts

    96
  • Joined

  • Last visited

Awards

This user doesn't have any awards

1 Follower

About JMSOG

  • Birthday May 05, 1996

Contact Methods

  • Twitter
    https://twitter.com/Dylan_Beight

Profile Information

  • Gender
    Male
  • Location
    It's relative, where are we all really?
  • Interests
    Music Production, Computers, Gaming, Music
  • Biography
    I'm an industrial music producer with a hobby for building and upgrading computers.
  • Occupation
    College student

System

  • CPU
Ryzen 9 5950X
  • RAM
32 GB G.Skill Trident Z RGB
  • GPU
MSI RTX 2080 Ti
  • Case
be quiet! Dark Base Pro 900
  • Storage
Samsung 970 Pro 500 GB (boot drive), unknown 2 TB Seagate, WD Black 4 TB
  • PSU
    Corsair HX1000i
  • Display(s)
Acer XB270HU, unknown 60 Hz 1080p, unknown 60 Hz 720p
  • Cooling
    Noctua NH-D15
  • Sound
Audio-Technica ATH-M50x
  • Operating System
    Windows 10

Recent Profile Visitors

1,380 profile views
  1. I don't remember exactly, but my GPU load is regularly sub-60%. VRChat on my current rig is *absolutely* CPU bound. Trust me here: I am beyond certain that on my current rig, in the case of *specifically* VRChat, my CPU is the bottleneck. 100%, absolutely no room for nuance. VRChat is weird, man, lol.
  2. Alright, I had a full night of sleep, so I'm expecting this to be a lot more coherent! Thanks for the responses, everyone.
     I was very aware of the dual-CCD-ness of the 7950X3D, and it's one of the core (lol) reasons why I wanted to avoid it. There are ways I know of to force VRChat specifically onto the correct 8 cores, but having to (potentially) do that for not just every program, but *any* program, felt...frustrating. I was aware that in games (VRChat specifically), they tend to be pretty neck-and-neck, without a clear winner in terms of raw performance. There was one feature I knew of that *could* make it worthwhile for my purposes: using a program to force any non-VRChat program (Discord, Firefox, Steam itself, etc.) onto the non-V-Cache CCD (see the rough sketch at the bottom of this list for the kind of thing I mean). However, I haven't seen anyone actually do this, or confirm it would have a notable benefit.
     As far as price, the 7800X3D will cost more because I'd have to buy a motherboard and DDR5, lol. But I already weighed the pros and cons of the 5800X3D and the 7800X3D offline, and came to the conclusion that I'd rather jump platforms. This was exactly my thinking: buy a first-gen AM5 CPU now, and a last- or near-last-gen AM5 CPU later. Felt like the best long-term solution. Maybe. We'll see.
     I would disagree about whether it's worth it. Something about VRChat, its engine, and the kind of assets that exist/get uploaded absolutely LOVES the 3D V-Cache CPUs. I have heard of performance gains of as much as double. This is based on several examples where people have compared V-Cache CPUs one-to-one with non-V-Cache CPUs, on personally knowing many people in-game who have made the jump, and on comparing my in-game performance to people with rigs identical to mine, minus the CPU. I would say it is personally worth it. I spend a lot of time in VRChat for social reasons, and that game is an absolute resource hog. Spending 3-4 hours with a screen on my face running at 50 FPS instead of 30 FPS sounds...absolutely incredible.
     I have a tool called fpsVR that I regularly use to monitor my performance in-game, and I check it regularly because I am like that, lol. It tells me a lot of things, but the key ones are framerate and frame *time* (including how much the CPU and the GPU each contribute to said time), as well as VRAM and RAM usage. My network bandwidth is 1 gig up, 1 gig down, and I regularly check that it actually *is* that speed. I connect to VRC servers typically on the same coast as me; basically, short of living next door to the correct datacenter, my network isn't going to get better. My GPU is also fine: I'm not exceeding my VRAM limit regularly, and my GPU frametimes are always VERY good in comparison to my CPU (WHILE IN VRCHAT). My CPU frametime, however, usually clocks in at about double to triple what my GPU is getting. Anecdotally, and this is a bit funny, I have on at least a few occasions been hit with workloads in VRChat *so* CPU-bound that when they hit, I can see my CPU freaking out frametime-wise while my GPU actually cools by as much as 10 degrees. It's hilarious.
     __________________________________________________
     Overall, in response to all of the above: I am certain that an AM5 3D V-Cache CPU will provide a performance boost that, genuinely, will be meaningful enough for my week-to-week life that it's worth it to me. I am getting the impression that the 7800X3D is, in fact, the correct move. I'll hold off on hitting the *buy* button until tonight, just in case someone comes in swinging with a good counterargument, but I think I'm set now. Thank you, everyone.
  3. I know threads like this are annoying, but I have looked everywhere for the last 2 weeks and can't find anyone who actually did this move. I'm looking for an outside perspective so I actually know which trigger to pull, so I can fully take advantage of the 3D V-Cache.
     Short version: I have a 5950X. The main thing I have used my computer for, for at least the last 2+ years (VRChat), will 100% certainly see massive performance gains from the 3D V-Cache, enough to make this more than worth it. Will dropping the 8 cores be outweighed by the improved cache/DDR5/improved frequency, or should I go for the 7950X3D?
     Longer story/list of my considerations: I play a LOT of VRChat, more than any other game. That game has proven to have absolutely nonsense performance gains with 3D V-Cache CPUs, so this is undeniably a worthwhile jump. I don't want to go the route of the 5800X3D, since it feels like too much of a compromise this long after its release. I also would rather jump platforms to AM5 NOW so I can get in on the AM5 ground floor, and *hopefully* my NEXT CPU upgrade after this will ONLY involve buying a new CPU, not a board/CPU/RAM. I don't do production tasks as much as I used to; I'm expecting that, in the occasional situation where I need to edit a video, the 7800X3D will be good enough. Maybe I'm getting old, but the (apparently) 50-100 watt lower whole-system power usage of the 7800X3D versus the 7950X3D (or even the 5950X) seems...really nice for my power bill (there are some rough back-of-the-envelope numbers at the bottom of this list). I have found that, during the summer, I have significant difficulty keeping my apartment cool when I use my desktop; dropping the power consumption should (in an admittedly small way) help with that. The fact that the 7800X3D doesn't have the inconsistent core-parking issues is...appealing. I can afford the 7950X3D, but it would obviously be *nice* if I only spent the money on a 7800X3D. I am obviously not considering the 7900X3D.
     So, basically: based on how my computer is currently used, a recent 3D V-Cache CPU seems like an incredible move, and reflecting on how my priorities have changed over time, the 7800X3D seems absolutely perfect. The *only* thing that could change my mind is if dropping the 8 cores would cause a noticeable performance drop in my day-to-day basic computer use/multitasking, seeing as I have been using a 16-core machine for a while at this point.
     So...has anyone else made that VERY SPECIFIC 5950X -> 7800X3D jump? Would the 7950X3D be better given my use case, or am I right that the 7800X3D is fairly perfect here?
     Sorry if parts of this are a bit incoherent/poorly worded/have bad grammar; I am tired and writing this at 11:35pm. I'm just tired of stressing out over this decision, and just want some outside opinions, lol. Thank you all for your time.
  4. Oh, I also wasn't clear on this: I already have the switch. Got it (at a heavy, heavy discount) for something different years ago, and it's just been collecting dust in a closet. So, both layouts would cost me 0 additional dollars. This was purely a question of "which layout would work the best". I super appreciate all the feedback, by the way. This is all helpful.
  5. My concern was less the 25 feet, and more whether having two devices sharing that 25 feet would have any practical/measurable downside over having two separate 25 foot connections. I'm getting the impression that the answer to my question is "not really". Which is helpful.
  6. This is a dumb question. I KNOW this is a dumb question. I know that the differences will be microscopic. I just want an understanding of what the difference will actually be, both technically and performance-wise.
     I'm likely about to move to a new apartment soon, and I'm planning out my network. I have two computers that will likely use a decent amount of bandwidth, and will regularly need to communicate with each other over the network. There are two possible layouts I'm considering (trust me that these are the only two that make sense):
     Router -> 2x 25 ft Ethernet -> two computers
     Router -> 1x 25 ft Ethernet -> 1 gigabit switch -> 2 computers
     The internet connection coming in, the router itself, the ports on the computers, and the switch are all gigabit. While I don't know for sure what category of Ethernet cable I have (5e, 6, 8, etc.), I know for a fact that it can carry a gigabit signal.
     I'm aware that in 99% of cases I won't notice a difference, possibly including this one. And, again, I know this is a dumb question. But, basically: would there be any measurable/practical difference between snaking two Ethernet cables around the apartment vs just one? (There's a small throughput-test sketch at the bottom of this list for putting actual numbers on it.) Thank you all in advance.
  7. My friend recently upgraded their MacBook to the most recent model. They want to install an antivirus, but were pretty unhappy with Sophos (the antivirus they previously used). However, they also don't know where to find a good alternative (that isn't just an ad for McAfee or Norton or whatever). Honestly, Macs are something I'm not super up-to-date or informed about, so I don't know either. What would everyone recommend as a good antivirus for a new Mac computer? I will be sending them a link to this thread. Thanks, everyone!
  8. TL;DR: is there a good reason NOT to use PrimoCache?
     Fuller explanation: My system has three drives in it: a 500 GB NVMe boot drive, a random 2 TB hard drive from 2013, and a 4 TB WD Black from 2015. I'm not doing anything complex with these drives...they were plugged into the computer, formatted, and that was it, no RAID or anything. To shorten a long story, it turns out I had another 500 GB NVMe drive just sitting around, which was (close to) never used. I don't think throwing another 500 GB of SSD storage into my system will be especially helpful without adding a lot of management, so I thought it might be interesting to use it as cache instead.
     I know Linus seemed to sing the praises of PrimoCache about 1-2 years ago in a video about using Optane on an AMD system. Frankly, it looks pretty good to me:
     1: It's non-destructive, so if my cache drive randomly dies, I won't lose anything (unlike StoreMI).
     2: It can support capacities of over 256 GB, something StoreMI can't do without paying a lot more than $30.
     3: It can support multiple drives, which means I could cache both my 2 TB and my 4 TB drive.
     The thing is...and this is just my thinking...if it were this perfect, why can't I find more people talking about it on YouTube? Why isn't everyone using it if it is this good? Am I missing something?
     So, before I put 6 terabytes of data at risk, I felt I should ask all of y'all...is there a good reason NOT to use the spare NVMe drive with PrimoCache in this case?
  9. TL;DR: I need to find a keyboard that can take a massive beating and still work long-term.
     Longer story: My mom is a heavy, fast typist. She claims that this is a combination of having learned to type on a typewriter and having previously had jobs that required a high words-per-minute rate. This translates to her typing having the equivalent destructive force of machine-gun fire. She also loves spending a significant amount of time playing typing games online, which means she breaks keyboards very quickly.
     About two or three years ago, I thought I would counter this by getting her an unbelievably good keyboard: the Corsair K65. It's built incredibly well, and I thought that the speed switches would be nice to type on. Plus, my keyboard is a K95 (basically a K65 with more features), and I love it, so I thought that she would too. Well, it lasted longer than any keyboard she has ever had, but in the last year keys have literally started to fly off the keyboard while she types. It's kind of hilarious.
     My initial thought is to just get her another K65. Like I said, it lasted longer for her than any keyboard before, and she says that she loves it. But I got that keyboard because I thought it would last 10 years, not 3, so doing it again feels like a massive waste. Like I said, her typing is absolutely destructive.
     I thought it would be a good idea to crowdsource information on this one: are there any keyboards more durable than the K65 on the market?
  10. Strangely, I cannot find a lot of opinions about this precise question. Or maybe I suck at googling. I currently have a decent 1440p monitor, and I'm considering making my final big tech purchase for the next several years a decent 4K monitor. So, for those of you who've upgraded from 1440p to 4K in the past...how noticeable is it? Can you really appreciate the difference? Keep in mind: I know about the differences in refresh rate. I know how much more horsepower it takes to run something at 4K vs 1440p. I know about response time, I know the size of the monitor matters, I know ultrawide exists, etc. I'm asking about, in a vacuum, how noticeable, appreciable, and useful the upgrade in resolution BY ITSELF is. Thoughts?
  11. It sounded a lot like a 4G hotspot, which was what I initially thought. It took a while, but I confirmed that that is not the case: they've been in places where there is no network coverage for their phone, which I think is why they are looking for this thing. And that's about what I expected as well. The performance is not going to be super great, which I will probably need to communicate to them. Is there a particular model or brand that you're aware of that tends to be more reliable?
  12. This is probably a stupid question, but I am very unfamiliar with this. I am my family's tech guy, and they normally turn to me to find the best version of whatever item they need (keyboards, screens, laptops, etc.). They have asked me to find what they call a "satellite wifi hotspot": something which connects "to a satellite", "not a phone network", and generates wifi that can handle two devices. I have no idea what they are talking about, but they are describing this device like they have seen it before. I am not sure what to call this device, and I don't know what to look for. Asking y'all seems easier. What is this device? Does it work the way it's being described to me? Is there a brand that makes the "best" one? What price am I looking at here? Thanks!
  13. I will update this later when I actually see the unit, but I doubt I will learn anything new. This is how it was described to me: my friend's laptop last night apparently started "leaking blue fluid through the HDMI port". My best guess is that the battery is messed up. A few weeks ago, she plugged a power adapter with the wrong amperage and voltage into her computer, which is my best guess as to how the damage occurred. MY BEST DIAGNOSIS based on this: I think her hardware is a lost cause. I suggested having me or someone she knows remove the hard drive ASAP to save her data. Again, I won't physically SEE the laptop until later today, but "blue liquid leaking out of the HDMI port" is pretty conclusive. Question: does it seem like I am correct?
  14. What's the current status of the notorious 2080 Ti issues? Have more recent batches been fixed? I haven't been able to find anything concrete.
  15. So, my primary monitor is an Acer XB270HU, and I've used it since May 2015. I was just packing up my computer to head to college, and then it did this: half the display showed strange coloration. It even happened when I disconnected it from the computer, with the "black screen" displaying the strange coloration. I took this picture in order to ask the forum "is my monitor dead?", but when I looked up from my phone, it had fixed itself...? Can anyone explain what happened, and whether I should be concerned?
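A minimal sketch of the per-CCD affinity idea mentioned in post 2 above, assuming Windows, Python with psutil installed, and that logical CPUs 0-15 map to the CCD being reserved for the game (that mapping is an assumption, not a given; check it in Task Manager or Ryzen Master first). The process names are placeholders, and tools like Process Lasso do the same job with a GUI and persistent rules.

```python
# Sketch: keep background apps off one CCD so the game can have it to itself.
# Assumes Windows, psutil installed (pip install psutil), and that logical CPUs
# 0-15 correspond to the CCD you want to reserve -- verify this on your chip.
import psutil

GAME_EXE = "VRChat.exe"                                        # process to keep on the reserved CCD
BACKGROUND_EXES = {"Discord.exe", "firefox.exe", "steam.exe"}  # example names only

RESERVED_CPUS = list(range(0, 16))   # first CCD with SMT -- assumed mapping
OTHER_CPUS = list(range(16, 32))     # second CCD -- assumed mapping

for proc in psutil.process_iter(["name"]):
    try:
        name = proc.info["name"]
        if name == GAME_EXE:
            proc.cpu_affinity(RESERVED_CPUS)   # pin the game to the reserved CCD
        elif name in BACKGROUND_EXES:
            proc.cpu_affinity(OTHER_CPUS)      # push background apps to the other CCD
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass                                   # process exited or needs admin rights
```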
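Rough back-of-the-envelope numbers for the power-bill point in post 3. Every input below is an assumption (wattage delta, hours per day under load, electricity rate); plug in real values.

```python
# Estimate yearly cost of a whole-system power-draw difference.
watt_delta = 75       # assumed midpoint of the claimed 50-100 W difference
hours_per_day = 4     # assumed time under load per day
price_per_kwh = 0.15  # assumed USD per kWh; check your utility bill

kwh_per_year = watt_delta * hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * price_per_kwh:.0f}/year")
# -> ~110 kWh/year, roughly $16/year with these assumptions
```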
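For the cabling question in post 6, a dependency-free Python sketch for measuring real throughput between the two machines in each layout. iperf3 is the usual tool for this; the default IP address and port below are placeholders.

```python
# Sketch: push a fixed amount of data between the two machines and report Mbit/s,
# so "two direct cables" vs "one cable plus a switch" can be compared with numbers.
import argparse, socket, time

CHUNK = 64 * 1024          # 64 KiB per send/recv
TOTAL = 512 * 1024 * 1024  # move 512 MiB per test run

def listen(port):
    with socket.create_server(("", port)) as srv:
        conn, addr = srv.accept()
        with conn:
            received = 0
            start = time.perf_counter()
            while received < TOTAL:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
            secs = time.perf_counter() - start
            print(f"{received / 1e6:.0f} MB from {addr[0]} in {secs:.2f}s "
                  f"= {received * 8 / secs / 1e6:.0f} Mbit/s")

def send(host, port):
    payload = b"\x00" * CHUNK
    with socket.create_connection((host, port)) as conn:
        sent = 0
        start = time.perf_counter()
        while sent < TOTAL:
            conn.sendall(payload)
            sent += len(payload)
        secs = time.perf_counter() - start
        print(f"sent {sent / 1e6:.0f} MB in {secs:.2f}s "
              f"= {sent * 8 / secs / 1e6:.0f} Mbit/s")

if __name__ == "__main__":
    p = argparse.ArgumentParser()
    p.add_argument("--listen", action="store_true")
    p.add_argument("--host", default="192.168.1.2")  # placeholder address
    p.add_argument("--port", type=int, default=5201)
    args = p.parse_args()
    listen(args.port) if args.listen else send(args.host, args.port)
```

Run it with --listen on one machine, then run it on the other with --host pointed at the first machine's address; compare the Mbit/s figure with and without the switch in the path.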