Kisai

Member
  • Content Count: 1,049
  • Joined
  • Last visited

Awards


This user doesn't have any awards

3 Followers

About Kisai

  • Title
    Veteran

Profile Information

  • Occupation
IT Support at a $14 billion Fortune 500 company.


  1. I think build quality across the board has gone down with ultrabooks. One of the Dell techs was telling me that Dell is now soldering the SSDs in their ultrabooks, and there are no cooling fans.
  2. PD is better than the QC option, and you can get USB PD devices that support QC over PD (Quick Charge 4(+)). The catch, of course, is that only certain devices have a chip that supports it. Based on the chips listed on Wikipedia, cross-referenced with the devices listed on WikiChip: Samsung Tab S5e, Google Pixel 3a, Samsung Galaxy A70, Samsung Galaxy A60, Samsung Galaxy M40, Motorola One Zoom, Motorola Z4, Motorola One Hyper, Samsung Galaxy A80, Samsung Galaxy A71, Google Pixel 3, Razer Phone 2, Samsung Galaxy Note9, Samsung Galaxy S9, Samsung Galaxy S9+, Sony Xperia XZ2, Sony Xperia XZ2 Compact, Sony Xperia XZ3, Samsung Galaxy Tab S6, Samsung Galaxy S10+, Samsung Galaxy S10, Samsung Galaxy S10e, Samsung Galaxy Note 10, Google Pixel 4. If it's not there, QC over USB-PD is not supported at all. I omitted some brands for brevity, but basically you're looking at devices that shipped in 2018 or later. I can't say for certain that all of those devices actually use USB-PD to begin with, but if you're thinking of replacing a mobile device at a later date, getting one that supports QC over USB-PD would be more future-proof than getting one that doesn't. With that said, RAVPower doesn't appear to advertise "Quick Charge 4". USB-PD is the way to go though, since QC is proprietary, and there should be new USB PD 3.0 devices out by now. If you just need something right now, then buy whatever is suitable for the devices you have now.
  3. You might be SOL because of the resolution. If you're trying to operate two machines in this configuration, the OS has to be told it's a valid resolution via the EDID, and most of the time you only get the option for 1080p or 1080p in vertical orientation. You might be able to have the EDID ignored and see if that works with the box, but that may simply keep the box from working at all. My usual suggestion for something like this is to plan ahead and buy hardware in the configuration you need in advance. Alternative: https://docs.microsoft.com/en-us/windows-hardware/drivers/display/overriding-monitor-edids This is easier to do if the monitor already has an INF file, because then you can just modify it. (There's a quick EDID-dump sketch at the end of this list if you want to see what modes the monitor is actually reporting.)
  4. Remember "My Briefcase" from Windows 95? Basically that. There are of course other ways of doing this, like just using Dropbox/OneDrive.
  5. You can either use a PCIe add-in card in a desktop, or you can use USB hubs for most of the low-bandwidth devices. High-bandwidth devices (e.g. flash drives, SSDs, video capture equipment) should not be put through a hub and should be on the chipset's USB 3.x ports only. Keep in mind that "do not use a USB hub" tends to be an advisory for audio hardware and capture hardware because of the latency the hub itself induces; that doesn't mean you can't. If an audio device is just a "DAC", so to speak, it likely uses about 1.5 Mbit/s per stereo channel pair, so on USB 2.0 that's barely a dent in anything. Video capture, on the other hand, can literally drown USB, requiring around 16 Gbit/s just for an uncompressed 4K RGB stream, or around 4 Gbit/s for HD. USB 3.1 Gen 1 is 5 Gbit/s, which is roughly one PCIe lane, so if you're trying to capture video over USB 3.1 Gen 1 it basically has to monopolize that port's entire bandwidth. Hence you would not want anything else sharing the port, and you won't know what shares it without testing the board. 4K capture cannot be done on USB 3.x; it can presently only be done over Thunderbolt 2 (20 Gbit/s). There are of course ways to compromise the quality to make it fit on USB (e.g. NV12, an onboard h264 compressor), it's just no longer high quality then. You're better off with a 4-lane PCIe card that does this and leaving USB out of it entirely. Multi-channel audio equipment may have the same issues. And USB 2.0 gear will still run at USB 2.0 speed on a USB 3.0 port. (See the bandwidth math sketch at the end of this list for the rough numbers.)
  6. If the RGB stuff doesn't interest you, then don't buy the RGB parts. There is no effect on anything except power usage.
  7. This is like asking if you can park an electric car in a gas-powered car's parking spot. The only thing that matters for the GPU on the motherboard is the PCIe version, and since they are backwards compatible, the worst you can ever do is plug a PCIe 3.0 card into a PCIe 4.0 slot and have it operate at PCIe 3.0 speeds.
  8. I would think you are hard of hearing if you can't tell the difference.
  9. *trying to locate where the thread went off topic*
     Yeah, most USB and HDMI cables just aren't marked with what version they are, and that's why devices keep coming with them. At the office I have:
     - 1 bin of DVI cables
     - 1 bin of VGA cables (I actually disposed of one bin last year, and I'm still finding devices connected with VGA that should be DP)
     - 1 bin of DVI-to-DP and DVI-to-VGA adapters
     - 1 bin of nothing but USB cables that should have been attached to the monitors but, for whatever reason, were never used
     - 2 bins of NEMA 5-15P to C13 (the standard power cable for North America)
     And the problems:
     - DVI: which ones are dual link?
     - VGA: new monitors don't come with VGA, and no device in the last 5 years has come with a VGA output
     - DP adapters: some are passive, some are active, and some screens flicker or malfunction when used
     - USB: black, white, blue. I know the blue ones are USB 3, but none of them are labeled. They're all Type A to Type B; occasionally I find a Micro-B cable in there
     - Power: some of these go back to the '90s, ranging from 15 A cables (thick) to 5 A cables (thin, for monitors and laptop bricks, and usually short too)
     We tossed the one bin of VGA cables primarily because we still had another bin of them and more come back than go out. The WD15 docks have one HDMI, one mini-DP and one VGA, so in some cases the VGA cables were left in place, but I replace the VGAs with the mini-DP and HDMI. The WD19 has dual DP only, which makes it usable with the DP cables that came in the box with the monitors. But the monitors wouldn't need to come with any cables in theory; I have them all. Even with laptops, there's no reason to keep shipping power bricks with them (which add $100 to their cost) if the user may have one already, but they are shipped because that's the correct and appropriate brick. So if we move to USB-C, any laptop that requires less than 100 W will be powered by USB-C PD only. Some of the XPS and Latitude models are already shipped this way. That's just following in Apple's footsteps, like with the MacBook Air. So the market is already going in that direction, and the only reason Apple is making a fuss is because that's someone's job. They clearly see it's targeted at them since the iPhone is the most popular* smartphone.
     *most profitable
  10. A PCIe capture card via the component video output if you don't have an appropriate thing to strip the copy protection, or straight HDMI input if you do (or if the firmware doesn't turn the protection on anymore). For obvious reasons I'm not going to tell you how to bypass the copy protection measures; Google, however, knows what does it. Alternatively, if swapping the cables is an issue, you can also buy external auto-switches or manual switches.
  11. So you trade easily repaired wired headphones that can last 30 years for worse quality and headphones that only last one or two. As far as replacing earbuds goes, I've had earbuds go through the wash, get stepped on, and get misplaced, and that happens regardless of whether they're wired or not. In one way earbuds were a good invention, because they're effective at keeping outside noise out, but at the same time they're pretty terrible audio-quality wise. I much prefer the larger headphones, other than the fact that I don't like things on my head.
  12. I'm not sure why people like wireless headphones and earbuds; the sound is bad because it's wireless, and Bluetooth audio is bad because it crushes the sound. It's passable in a car or on a mobile phone, because you typically use those devices in non-quiet environments in the first place, so you don't really notice or care how poor the audio really is. The "correct" way to listen to music or watch movies on the PC is a PCIe sound card with its own DSP (games have less audio latency then); next to that are USB 2.0 or better DAC devices, which are fine for over-the-ear headphones, but generally only for music. Even on your run-of-the-mill wireless headphones/headsets with integrated microphones, the microphone ends up being the most worthless part of the headset, being only good enough to capture an adult male's frequency range, and it makes everyone sound like they're underwater. The primary problem with Bluetooth audio is that A2DP devices only support lossy MP3 and AAC audio, if they support those at all; otherwise they use SBC (roughly 1/3rd the quality of CD), which is not very good. They do not support lossless audio. So unless the device explicitly states it supports AAC or even aptX, it will never approach CD quality. The default is simply bad, and there is no upgrade path in that. Bluetooth audio devices are essentially rubbish, because there's no way to tell if a device is good or not, at least not without pairing it to a device and actually interrogating the audio codec used. (The codec bitrate sketch at the end of this list shows the gap.)
  13. The headphone jack requires installing the USB audio driver and then the system audio driver AND the mixer software, in that order. Windows Update will download a working driver, but it won't install the control panel that auto-switches from speakers to headphones and back. This is a Realtek issue and it plagues all Dell systems. Basically, if you plug in or unplug an audio device, you should see a popup asking "which device did you plug in?", unless you previously dismissed it to never show again. It literally waits for you to hit OK before it switches. As for Thunderbolt firmware, I don't know how Lenovo managed to screw up the TB firmware, but the TB firmware is not something installed into the OS like USB firmware; it's written to flash somewhere, since it survives OS reimaging. Speculatively speaking, if a corrupt firmware can kill the TB3/USB-C controller, that speaks to some incredible incompetence. On a Dell, if the Thunderbolt 3 hardware becomes disabled due to a bad TB flash, you just pull the BIOS battery, resetting everything in the BIOS, and somehow this makes it enable-able again so you can flash it again. I don't know if that would work on a Lenovo if the TB hardware stops working, but it's something to consider.
  14. I would not recommend it. Even if you remove the physical port, the OS will still try to power it, so if you short it in the process, you've destroyed all the ports. If you want a smaller board, you should buy a smaller board; you might consider Nano-ITX, Pico-ITX or Mini-ITX boards, which try to cram more into a small space for an embedded device, but these are not really functional for anything but embedded systems.
  15. Nah, IT departments at corporations are just going "we're not replacing anything in 2020 that isn't 5 years old"
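
A minimal sketch for #3, before resorting to an EDID override INF: it reads the EDID blobs that Windows caches in the registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY and decodes each monitor's preferred mode from the first detailed timing descriptor. Assumes Windows and Python 3, and a monitor that has been connected at least once; the helper names are just for illustration, and it only reads the registry, it doesn't write any override.

```python
import winreg  # Windows-only standard library module

DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def iter_subkeys(key):
    """Yield the names of all subkeys of an open registry key."""
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:
            return

def preferred_mode(edid):
    """Decode (width, height) from the first detailed timing descriptor (bytes 54-71)."""
    d = edid[54:72]
    width = d[2] | ((d[4] & 0xF0) << 4)
    height = d[5] | ((d[7] & 0xF0) << 4)
    return width, height

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
    for model in iter_subkeys(display):
        with winreg.OpenKey(display, model) as model_key:
            for instance in iter_subkeys(model_key):
                try:
                    params = winreg.OpenKey(model_key, instance + r"\Device Parameters")
                    edid, _ = winreg.QueryValueEx(params, "EDID")
                except OSError:
                    continue  # this instance has no cached EDID
                if len(edid) >= 128:
                    w, h = preferred_mode(bytes(edid))
                    print(f"{model} ({instance}): preferred mode {w}x{h}, {len(edid)}-byte EDID")
```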
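
And the bandwidth math for #5, as a back-of-the-envelope sketch. These are nominal link rates and raw pixel rates with blanking and protocol overhead ignored, so treat the output as ballpark only.

```python
def video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s (8-bit RGB by default; blanking ignored)."""
    return width * height * fps * bits_per_pixel / 1e9

streams = {
    "1080p60 RGB (uncompressed)": video_gbps(1920, 1080, 60),
    "4K60 RGB (uncompressed)": video_gbps(3840, 2160, 60),
    "4K60 NV12 (4:2:0, 12 bpp)": video_gbps(3840, 2160, 60, bits_per_pixel=12),
    "Stereo 16-bit/44.1 kHz PCM audio": 44_100 * 16 * 2 / 1e9,
}

buses = {  # nominal link rates, Gbit/s
    "USB 2.0": 0.48,
    "USB 3.1 Gen 1": 5.0,
    "USB 3.1 Gen 2": 10.0,
    "Thunderbolt 2": 20.0,
}

for name, gbps in streams.items():
    fits = [bus for bus, cap in buses.items() if gbps < cap]
    print(f"{name}: {gbps:.3f} Gbit/s -> fits on: {', '.join(fits) or 'none of the above'}")
```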
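
And for #12, the rough numbers behind the codec complaint: commonly quoted maximum bitrates for the usual Bluetooth audio codecs against CD-quality PCM. The codec figures are approximations, not values pulled from the A2DP spec, so treat them as ballpark.

```python
# Commonly quoted maximum codec bitrates (approximate) vs. CD-quality PCM.
CD_PCM_KBPS = 44_100 * 16 * 2 / 1000  # 1411.2 kbps: 16-bit stereo at 44.1 kHz

codecs_kbps = {
    "SBC (A2DP mandatory codec)": 328,
    "AAC (typical over Bluetooth)": 256,
    "aptX": 352,
    "aptX HD": 576,
    "LDAC (highest setting)": 990,
}

print(f"CD PCM: {CD_PCM_KBPS:.0f} kbps, lossless")
for codec, kbps in codecs_kbps.items():
    print(f"{codec}: {kbps} kbps lossy, ~{kbps / CD_PCM_KBPS:.0%} of the CD bitrate")
```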