
Mira Yurizaki

Member
  • Content Count

    20,911
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About Mira Yurizaki

  • Title
    Beep boop

System

  • CPU
    AMD Ryzen 7 2700X
  • Motherboard
    MSI B450M Mortar Titanium
  • RAM
    2x8GB DDR4-3200 Corsair Vengeance LED
  • GPU
    EVGA GeForce RTX 2070 Super XC
  • Case
    NZXT H400i
  • Storage
    250GB Samsung 970 Evo, 1TB Crucial MX500, 1TB 2.5" Seagate Barracuda Pro
  • PSU
    Corsair RM550x
  • Display(s)
    ASUS PG279Q, Dell P2715Q
  • Cooling
    Corsair H100i Pro
  • Keyboard
    Corsair K70 Lux
  • Mouse
    Logitech G502
  • Sound
    Sound BlasterX AE-5, Logitech Z906, Sennheiser HD6XX
  • Operating System
    Windows 10 Pro

Recent Profile Visitors

82,922 profile views
  1. If anyone was regularly looking at my profile, they might've noticed that I haven't logged in in a while. The short of it is, I'm finding myself not interested in coming back. I've had points before where I took a step back from the forums and then returned. But the last time, I decided, "if I'm going to keep doing this, why am I even coming back?" and just stopped showing up. Then I mulled over whether I should at least say something, because every time I left some other community, I just left, and I figured I could break that pattern at least this once.

     

    So here it is, my last post, because I don't plan on coming back. I'm not going to ask the moderators to do anything special to this account and will just leave it as-is. As for why I'm leaving, there are multiple reasons, and if you're really dedicated I'm sure you can infer what they are. Or you've already talked to me on Discord or somewhere about it. Saying them won't change anything, because this place isn't for me anymore, and it's not like I'm anyone important anyway.

     

    I'm not shunning all contact with the people I talked with on this forum. I'm leaving the option of sharing my Discord handle, and if you found whatever I wrote interesting, I started a blog. However, you'll have to DM me for the Discord handle, since I'm not publicly sharing that :P  Though I will start buttoning up this account at the end of the month to move on.

     

    It's been a pretty good almost-four-year run, and I hope I've at least made a difference somewhere, but I don't think I can continue with this place.

    1.   Show previous replies  2 more
    2. seee the state im in nooow

      the internet's a big place, we all choose our vectors of communication. don't feel bad

    3. TVwazhere
    4. PlayStation 2

      Can't blame ya for losing interest in here. Peace, man.

  2. You'd have to go into BIOS/UEFI and adjust the settings so that it configures the RAM to a faster speed. However, this does not guarantee that the RAM will be able to run at that speed. If the board says it's compatible only up to 1333, then it likely won't like anything higher.
  3. I don't think it works that way; Apple can't make someone else's chip without negotiating with the owner of the IP. Otherwise other companies could just ask for Apple's A-series SoCs. And on leveraging chiplets: yes, AMD can make it, but their current APUs are not designed as MCM processors. They'd have to make an entirely new design, which doesn't really make sense when they should be leveraging what they have. Apple would have to fork over the money for that, which I bet they won't unless they also make AMD fork over the manufacturing rights for that SKU.
  4. But AMD doesn't actually make the chips; TSMC does. And Apple already contracts them to make their mobile processors (I believe Samsung is a second supplier, but given Samsung is a bit behind, they're probably only used for lesser SoCs, if any). Also, the APUs are still monolithic, so AMD can't even leverage chiplets for this.
  5. It is microSD cards. But the reason nobody talks about them for any serious long-term storage solution is that they're basically bottom-of-the-barrel flash chips, so you're not getting performance or reliability that's suitable for the job.
  6. Looking around, I'm only seeing cards that have either the USB-A ports right there or a header, but not both. One thing to note is that USB 3.1 Gen 2 cards need about 4 PCIe lanes to work (at least at PCIe 2.0 speeds); otherwise there's no real point. Looking at your motherboard, there's only one other slot that can provide 4 lanes, which is the second graphics slot. This will cut the lanes the graphics card gets in half; the other x16 slot is really a PCIe 2.0 x2 slot. It probably won't matter, though, since graphics cards aren't hampered much by going down to 8 lanes.
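    To put rough numbers on that lane math (a back-of-the-envelope sketch; the per-lane figures are my own approximations after PCIe encoding overhead, and `link_gbps` is just an illustrative helper, not any real API):

```python
# Rough check of whether a PCIe link can feed a USB 3.1 Gen 2 (10 Gbps) port.
# Approximate usable per-lane throughput in Gbps after encoding overhead
# (8b/10b for PCIe 2.0, 128b/130b for PCIe 3.0).
PCIE_LANE_GBPS = {"2.0": 4.0, "3.0": 7.88}

USB31_GEN2_GBPS = 10.0

def link_gbps(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe link."""
    return PCIE_LANE_GBPS[gen] * lanes

# A PCIe 2.0 x4 link (~16 Gbps) comfortably covers one 10 Gbps port,
# but a 2.0 x2 link (~8 Gbps) would bottleneck it.
print(link_gbps("2.0", 4))  # 16.0
print(link_gbps("2.0", 2))  # 8.0
```

    Which is why a card in the x2 slot would have no real point: the slot itself is slower than the USB port it's supposed to feed.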
  7. I do on my desktop. Arguably it depends on whether you want to keep 96 PPI scaling or not. My phone is 2560x1440, but the UI is scaled properly.
  8. There are always two camps of PC gamers: one that declares some game the next "Crysis", and another that says the same game is horribly unoptimized.

    1. Mira Yurizaki

      Mira Yurizaki

      On a side note, Crysis itself is not optimized to scale on today's hardware, so I guess both sides are technically saying the same thing.

    2. TopHatProductions115
  9. "Committed" shows how much virtual memory space is in use out of how much is available. Virtual memory space is physical memory + page file size (which could be 0).
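    The arithmetic behind that second number is simple enough to sketch (the function and the GB figures are hypothetical illustrations, not pulled from any real machine):

```python
# Commit limit = physical RAM + total page file size.
# This is the "available" half of Task Manager's Committed readout.

def commit_limit_gb(ram_gb: float, pagefile_gb: float) -> float:
    """Total virtual memory the OS will promise to back (the commit limit)."""
    return ram_gb + pagefile_gb

# e.g. 16 GB of RAM with an 8 GB page file:
print(commit_limit_gb(16, 8))  # 24
# With the page file disabled, the limit is just physical RAM:
print(commit_limit_gb(16, 0))  # 16
```

    Once committed memory approaches that limit, allocations start failing even if plenty of physical RAM is still free.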
  10. I think the moment game rendering went to physically based rendering was when things started to look more or less photorealistic. Lighting makes a huge difference with regards to how "real" something looks.
  11. It'll boot fast, considering UEFI images are still only around 16-32MB. It just won't be very useful, since there are few hardware interfaces simple enough to work with.
  12. Everyone appears to be sourcing this Tweet: I don't know about you, but a screenshot of what appears to be a random text file with code names doesn't seem indicative of anything other than just that. And I can't find anything that would make this person credible about anything. Also, poking at the other related Tweets they posted, they seem to point only to GPU technologies and the like. To me, if this is from something in macOS's code base, it points more to a video driver file that had extra stuff hanging around than any indication that Apple is going to