
Kisai

Member
  • Posts

    7,652
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Female
  • Location
    🍁
  • Interests
    Computers, Anime, Video Games, Video, Photography, Networking, Servers
  • Occupation
    IT support at a $14 billion Fortune 500 company.


  1. If you're turning your head, you're too close to the computer for the monitor size. Optimally your main monitor is slightly to the right or left, and the secondary monitor is the same size (not necessarily resolution) as the main. 24" monitors are usually the ideal size for coding because you only have to turn your eyes, not your neck. With larger monitors you end up moving your head more, in which case you may as well just use a 32" monitor at 4K rather than two 1080p monitors. All a matter of preference though.

     Also consider a different orientation if you're taller: if you can't look at the monitor without turning your neck, you might want a dual-monitor-arm setup so you can move the screens to where they're comfortable. I do all my work on the main monitor and put the passive stuff (e.g. looking things up, watching YouTube) on the secondary.

     To give you an idea of my setup, it's 2x 23.8", with the main being 4K. The second doesn't tilt, so I have them like this:

         ===================
         |        ||       |
         ===================
           | KB |

     The split between the monitors is roughly where F3 and F4 are. Ideally I would have monitors with thinner bezels, but it's really hard to find a 4K IPS 24" non-glossy monitor in that configuration.
  2. Maybe a bit distracting if it's not showing something that needs to be in the shot. They might be using a synced multi-camera setup, in which case the correct way to make it less distracting is not to pause between the cuts, but rather to switch to the correct camera when his head turns while speaking. It's not something I'd advocate changing per se, but it's one of those things that makes it seem like the video gets padded by doing so.
  3. This is the most likely reason. At the building I'm at (which has dozens of floors), FedEx still ding-dong-ditches with "sorry we missed you" door hangers that they stick on the window beside the entrance. One of these has been sitting there since April 29th; it's now 10 days later. That's more than enough time for someone to take a photo of the hanger, look up the code on the website, and try to get the package from the place the tracking says it was dropped off.

     Both FedEx and UPS are extremely poor at delivering to residential addresses in BC, and will likely force you to go pick up the parcel at a depot 2 miles away rather than actually deliver it. Canada Post likewise will "lose packages", and their support options are basically getting stuck in chatbot hell being told to pick it up at a concierge that you don't have.

     In an ideal situation, if you live in any kind of MDU (any building with more than one legal unit), there should be a designated location for parcels to be delivered to. If the building has a rental office (good rental buildings have one), then letting the rental office sign for packages should be an option, rather than having the couriers ding-dong-ditch and not try to deliver anything. Your package is arguably more likely to be stolen at the depot, because some of these depots aren't owned or operated by the courier; they're merely places authorized/outsourced to hold parcels. These can be convenience stores, pharmacies, or copy shops, and the people who work there have no loyalty to the courier.

     Ultimately the fault lies with the courier, and technically it should be Newegg filing the claim if you paid for it, because until you receive it, Newegg is technically still in possession of it. Newegg is going to be the one to eat the loss if you charge back.
  4. "Dumpster diving", or go to places that sell them. I kid you not, at least one of the places selling these is just a DHL/FedEx/UPS drop-off point. If you work at a computer store/Best Buy, it's also possible to just take things the store wants to get rid of and doesn't want to pay to dispose of. Data centers are another option, since customers will abandon hardware. Most of the stuff you find on eBay, when you see a seller with dozens of the same item, was basically bought from a bankruptcy auction, estate sale, or government auction.
  5. Usually a "screen going black" that isn't a connector issue is the driver being reset, which happens on weak GPUs and "factory OC" GPUs that encounter a load that triggers a thermal fault. The correct thing to do is to see if it happens again. If it's exactly repeatable, that usually indicates a hardware failure. If it's seemingly random, that indicates a RAM or CPU fault, usually RAM. When a GPU is at fault, it will always fail under the exact same conditions. For GTX/RTX 70/80/90-class parts and AMD's equivalents, this can also indicate a power issue with one of the GPU power connectors.
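A quick way to confirm a driver reset (as opposed to a bad cable) is the Windows System event log: TDR recoveries are logged there as Event ID 4101, source "Display". A minimal PowerShell check, not from the original post, assuming a Windows machine:

```shell
# List recent GPU driver resets (TDR events). These appear as
# "Display driver <name> stopped responding and has successfully recovered."
Get-WinEvent -FilterHashtable @{ LogName = 'System'; Id = 4101 } -MaxEvents 10 |
    Format-List TimeCreated, Message
```

If those events line up with the black screens, it's the driver resetting; if nothing is logged, a connector or power problem is more likely.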
  6. The only regulation that is going to happen is one that favors corporations, so be careful what you wish for. If full disclosure of the dataset behind all public and private models were mandatory in order to commercially operate them, then that would become the status quo, and if a licensed dataset is used to generate a model, then any time that model is invoked, the dataset provider should be paid a royalty. That is what the corpos want.

     ML models do nothing but plagiarize content. They are not human; they have no means of interpreting, adapting, or improving upon something. They simply auto-complete. You say "tell me a story about an Evil AI" and they will find all the various phrasings of "Evil AI" in the model and generate a story with absolutely no coherence, because the model does not know "the seven basic plots" https://en.wikipedia.org/wiki/The_Seven_Basic_Plots or "the hero's journey" https://en.wikipedia.org/wiki/Hero's_journey that damn near all storytelling mediums except video games abide by. Only visual novels and RPGs follow that in video games, with the rest being environmental or self-invented based on how the game bashes two rocks together, or how the game's physics explode.

     An "AI" is never going to tell a compelling story. It's only ever going to produce piles of garbage and occasionally something funny from the incoherence of it all.
  7. This is how 'merica crumbles: all the popular IPs owned by a small handful of companies, who are also the gatekeepers (Sony/Microsoft/Nintendo for video games, and Warner/Disney for video and music).

     The clear writing on the wall for Windows/Xbox is that Microsoft will at some point make Windows too locked-in: if you want your software on the Microsoft platform, you'll need to put it on the Microsoft Store and make it available on the Xbox and Surface platforms. The only reason this isn't a thing already is that it would attract antitrust scrutiny. By hook or by crook, you will be forced to put your software on Microsoft's stores, and your game or app will never have marketing without publishing through them.

     Anyhow, joke's on these companies. IP farming only works when an IP is kept in a neutral state where the owner is willing to license in perpetuity. If I make a game using Warner's or Microsoft's IP, they cannot at a later date tell me to cease selling the game, even if it's bad. I can't make a new game without a new license, but I should not be forced to renegotiate the license on the existing game. If they want more money for every subsequent sequel, I may as well dump the IP and build a new one that is basically the spiritual sequel. I can name dozens of IPs that Microsoft already OWNS through its acquisition of Activision that I would love a sequel to (the Sierra "Quest" series being several neglected IPs), but these idiot companies would rather sit on the IP and produce nothing.
  8. Use the built-in (SMB, aka Samba) file sharing. Just a point of interest: if you set up one of Microsoft's "passwordless" accounts (e.g. using your email address as the Microsoft account), it will NOT work. The workaround is to create a non-Microsoft account with a password and either set the ownership of the files to it, or just make it an admin if you're going to be the only one using it. You can also use RDP this way. Again, be aware that you need the non-Microsoft account, because for some reason MICROSOFT didn't think ahead before requiring Microsoft accounts.
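A sketch of that workaround from an elevated PowerShell prompt; the account name, password, and share path below are placeholders, not from the post:

```shell
# Create a local, non-Microsoft account with a password -- SMB and RDP can
# authenticate as this account where a passwordless Microsoft account fails.
net user shareuser "S0me!Passw0rd" /add

# Optional: make it an admin if you're the only one using the machine.
net localgroup Administrators shareuser /add

# Share a folder over SMB and grant the account full access.
New-SmbShare -Name "Shared" -Path "C:\Shared" -FullAccess "shareuser"
```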
  9. Chances are they snapped it off themselves and didn't realize there's a screw. Standoffs come off the chassis all the time because the screw-to-standoff joint is often tighter than the standoff-to-chassis joint. If the motherboard had a standoff screwed on and broke it, then it likely hasn't damaged the actual PCB. They might even have used the same standoffs they use for the M.2s, in which case it just needs to be screwed back in. It also needs to be pointed out that the M.2s don't even need the armor plate; you can buy M.2 drives with their own heatsinks (e.g. WD_BLACK), so IMO this does not render the motherboard inoperable.
  10. It's illegal to mine uranium or build/operate a nuclear reactor in BC, so that won't be happening. That said, a fun but very silly and expensive "off-grid" project would be to build a wind turbine. Probably not for LTT though. "What can we run on 2 MW of power? BOINC? Crypto? A supercomputer?" I mean, they're relatively cheap when you consider other options; it's just that wind isn't 100% reliable.
  11. Option 1 (dock):
      1. Get a USB-C dock, and set your monitors to auto-switch inputs. Be aware that USB-C (not Thunderbolt) docks max out at 2x 1080p.
      2. When you want to switch devices, either a) plug in the laptop and suspend the desktop (or put it on a 1-minute monitor sleep), or b) unplug the laptop and wake the desktop.

      Option 2 (no dock): plug both monitors into the desktop, then plug both inputs into the laptop with USB-C-to-HDMI adapters. This of course requires two USB-C ports that have display output. Not really recommended, because unless it's an engineering laptop, there are usually only two USB-C ports and only one of them carries video. You can also plug the laptop's dock into the desktop; you can even plug your desktop keyboard and mouse into the dock and just use the laptop closed, then plug the USB-C back into the desktop when you need the keyboard and mouse. However, this is going to wear out the USB-C connectors.

      Option 3: just get a video switch. This is just a random one off AliExpress. You would need a USB-C-to-HDMI/DP adapter for the desktop, but this would otherwise work.

      There's one other option: a "USB-C" monitor. Most of the (Dell) ones act as a KVM when the USB ports on the monitor are used, but if one computer isn't USB-C then you can't use it without moving the USB-C cable.

      The less complicated answer is to just buy whatever does exactly what you want, and if that involves a longer 2 m HDMI or DP cable with a USB-C adapter, that's probably the cheapest option. Then use the monitor's button to switch inputs, or set the display timeouts to 1 minute and let it auto-switch to whatever is in use. Basically, the answer for "I need two monitors" is "dock", but for "I need only one monitor" it's "adapter".
  12. Synthetic benchmarks do not reflect real-world usage, even at the best of times. If there is enough PCIe bandwidth, then it will simply have more latency the larger the render. I distinctly remember the 10% hit with the GTX 1080 at 4K, where dragging the FFXIV window between the iGPU (HD Graphics 4600) and the dGPU would see the framerate drop 10%, but I never actually benchmarked it then. I just never used the iGPU to game on, only to watch videos on while working on the main display. Same as now, because under Windows 7, whatever starts up on one screen is rendered on that GPU. Convenient when you just need a YouTube video up.

      But as I stated, unless you have an actual use case for this setup, it is a bad idea to use the iGPU outputs while you have dGPU outputs available. Desktops do not have MUX switches, so getting most software to utilize the dGPU without something plugged into the dGPU will result in the iGPU's render engine being used. You would need to manually force programs to use the dGPU if they default to the iGPU.
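For reference, Windows 10 (1803+) and 11 expose a per-app GPU preference (Settings > System > Display > Graphics), which is stored in the registry. A sketch of setting it from a cmd prompt; the EXE path below is a placeholder:

```shell
:: "GpuPreference=2;" requests the high-performance GPU (usually the dGPU);
:: "GpuPreference=1;" requests the power-saving GPU (usually the iGPU).
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" /v "C:\Games\Example\game.exe" /t REG_SZ /d "GpuPreference=2;" /f
```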
  13. Well, fundamentally they went about this in the most stupid and lazy way. SSOs always result in privacy problems. Steam has "Sign in with Steam", PSN has "Sign in with PSN"; just let people link the accounts in either direction and give them licenses for both if they own it on one. Problem solved. If people don't have a PlayStation they use PSN with, or don't want a PSN account, then they don't need to link one.

      Crossplay has never been an issue in other games. You enable crossplay by simply having the game keep track of multiple friends lists and asking players to link their accounts so their friends can find them under either. That's all. If they don't want to, then they'll never know when their friends on the other service are playing.

      Honestly, if this was going to be an anti-cheat/abuse-prevention feature, they should have been forthcoming about it, but clearly it was just Sony not being willing to read the room.
  14. This is fundamentally incorrect. What is happening here is a feature used by iGPU+dGPU laptops that have a MUX to switch between iGPU-alone and dGPU+iGPU modes. When you use it on a desktop, there is a large RAM-use penalty in the Desktop Window Manager. How do I know this? Because that's how I operate my desktop: iGPU on one screen, dGPU on the other.

      If something is launched on the dGPU and dragged to the iGPU, it's usually fine, but DWM usage cranks up. If you launch something on the iGPU and then move it to the dGPU, the same thing happens. Performance-wise, there's always around a 10% nerf to whatever application is moved from one GPU to the other. It's not an unnoticeable nerf either, because you will also notice the 2-frame lag from the GPU-to-GPU transfer.

      Do not run your computer this way unless you have a good reason to. If you are doing video-editing work, this is basically the optimal way to be able to use all the GPU power and video decoders/encoders. Outside of that, the iGPU's 3D performance is weak, and the only way to get a dGPU's framerate on the iGPU-connected monitor is to start the game/app on the dGPU and drag it over to the iGPU. Some games allow you to specify the render GPU independently of the connected monitor; this will work, usually, but you're still going to get a noticeable latency and performance nerf.

      [Benchmark screenshots: dGPU render on iGPU display vs. dGPU on dGPU.] Also note, the UHD 770 does not support DX12 Ultimate, so this benchmark cannot be run from the iGPU.
  15. Most "basic" speakers are worse than $25 earbuds. For a computer to sound decent, it needs to be at least a 2.1 set with a subwoofer. Anything without a subwoofer requires a much larger set of speakers, and you just won't find those anymore since they went out of style in the '90s.