
makinbacon21

Member
  • Posts

    61
  • Joined

  • Last visited

Awards

This user doesn't have any awards

Contact Methods

  • Discord
    makinbacon#5364
  • Steam
    makinbacon21
  • Xbox Live
    makinbacon21#999
  • Reddit
    Strong_Assumption_95
  • Twitch.tv
    makinbacon17
  • Twitter
    makinbacon21
  • Website URL

Profile Information

  • Gender
    Male

System

  • CPU
    AMD Ryzen 9 5900X
  • Motherboard
    ASUS ROG Strix X570-I
  • RAM
    32GB Corsair Vengeance RGB 3600 MHz
  • GPU
    EVGA GeForce GTX 1080 Ti SC Black Edition
  • Case
    ASUS ROG Z11
  • Storage
    Samsung 970 Evo 1TB M.2, Samsung 860 Evo 2TB SATA
  • PSU
    Corsair RM 850-smth
  • Display(s)
    ASUS ROG PG348Q
  • Cooling
    Corsair H105
  • Keyboard
    ASUS ROG Strix Flare
  • Mouse
    ASUS ROG Gladius II Origin
  • Sound
    Razer Blackshark V2
  • Operating System
    Windows 11 Pro
  • Laptop
    Surface Laptop Studio


makinbacon21's Achievements

  1. It is entirely Samsung TV fw bugs. I have to basically disable CEC to get it to cooperate. So annoying. They seem to *generate fake EDIDs* per device based on their guess of what the device is capable of.
  2. OK so for anyone with this issue, we found a solution: use the 2nd (non-ARC) HDMI output only, disable ARC for kicks, hook up to TV via optical, and let the consoles run right through the receiver. Works like a charm.
  3. Let's start with some background. My school's computer club was granted money to purchase a TV, current-gen consoles, and a new AV receiver. We went with a Samsung Q80-something--4K, 65", 120Hz, the whole package. Alongside that, we purchased an Xbox Series X and a PS5, and followed the purchases up with a new Yamaha RX-A4A receiver. All incredibly expensive equipment from reputable manufacturers, so we assumed the setup would "just work." It did not. Let's dive in.

RX-A4A
The A4A is a 7.1-channel audio receiver marketed as being able to do full 4K120 HDR passthrough with ALLM, Dolby Vision, and other protocols preserved via its many HDMI inputs and 2+1 (zone 2) outputs. Unfortunately, it does not properly broadcast this fact to the devices hooked up to it. As far as I can tell, this is likely because the functionality was added in a firmware update--it would appear that either whatever EDID database each connected device consults shows the A4A as not supporting higher-bitrate content, or the device itself broadcasts the wrong capabilities. It also completely disables video out whenever the screen blanks from a client device--i.e. when you load into a game on either console. God of War Ragnarok blanks until the end of the Santa Monica Studio logo segment, for example. This is similar to an older bug on 2020-era Samsung TVs where Game Mode would spam-toggle on and off during the early loading/splash-screen segments of games. The A4A has fewer issues with EDID recognition on its second HDMI output, where it seems to pass through the host's information--the TV recognizes the input as the host device (i.e. the Xbox Series X tile shows up on the awful Samsung game menu), and the device recognizes the display out as the TV. HDMI out 1 is still needed, as it supports eARC and the second does not; these points are likely related. However, the audio input gets automagically mapped back to the TV via CEC or some other detection protocol. Output over HDMI out 2 --> turn on console --> TV makes the awful high-pitched game-mode-enabled noise --> audio input goes to the TV and the console cannot output audio. Trying to change the input blanks the screen, and then the TV gets some CEC input to switch back to HDMI out 1 carrying the eARC signal. This remains true even if you disable HDMI video output over the first port.

Samsung Q80
I will not count the absolutely terrible interface (2021+) as a bug, but it should be noted that this is the single worst interface I have ever used on a TV, including the Panasonic Viera Link interface from 2010. I am talking ultra-bad. Anyway, the first issue I'll talk about is the broken audio passthrough interface. While the TV is perfectly capable of doing eARC audio, it will flat-out REFUSE to broadcast the ability to play anything above stereo, including in passthrough mode. Dolby is of course the exception here, but I would like 5.1/7.1 uncompressed to be available, and for some reason it is not. Using the PS5 or a PC, it is possible and easy to override the advertised capability. In Windows, you can just hit Configure in sound settings to select the configuration, and the PS5 has a nice, easy-to-use menu that lets you select the type of output device manually. With passthrough, it works flawlessly. However, the Xbox lacks this capability and will only give options for what the TV explicitly identifies as being capable of. The next issue is also with this so-called "passthrough." If the signal were actually being "passed through," there wouldn't be a second-long delay when using Atmos. Hooking the Xbox directly up to the A4A over HDMI output 1 (despite the inability to do anything above 4K60/1080p120) yields perfect audio, with all options enabled and essentially zero audio latency. The TV should not add that much time, and clearly, since it monkeys with the broadcasted capabilities, it is not actually passing through. This prevents me from using passthrough on the TV to circumvent the other issues. The TV has one last crippling issue that breaks key functionality: it reports different capabilities based on a predefined input list. Companies other than Sony have special tile icons for their devices on the TV menu, and each one goes along with a specific profile. This results in interesting behavior, where devices capable of more are restricted to what Samsung's profile thinks they can do. A great example comes from Switchroot Android. The latest update for the Android 10 version uses a device tree that identifies the Switch to other devices as a Nintendo Switch (previous versions showed up as NVIDIA Shields). The Switch's stock firmware locks it to 1080p60, but the v1 is actually capable of outputting up to 4K30, and this capability is unlocked in Android and Linux. However, the TV sees a "Nintendo Switch" and assigns it the Switch profile. The Switch is then informed that the TV only supports up to 1080p60. Absolutely hilarious, and this bug has existed since at least the 2018-ish models when Game Mode was new (haven't tested before that). That seems to be part of the issue with the first HDMI output on the A4A--that output is the only one that registers as the A4A and not the original host, and the TV is likely assuming the A4A is incapable of outputting anything better than 4K60/1080p120. It's also possible that the aforementioned A4A self-reporting issue is causing this behavior, but given that I've seen this before, I figured I'd mention it.

Xbox Series X
The big issue on this device is simply that it takes whatever the output device reports as law. On Windows, you can force your way over HDMI, be this by virtue of the GPU driver or some capability Windows has that Xbox doesn't expose. The audio passthrough issue seems to be worse on Xbox than on other devices, but it's possible that's just because my other devices don't support Dolby.

PS5
The PS5 is able to override most of the bugs of the other devices, but it has one weird quirk: when put through the A4A, while it seemingly can do proper video passthrough, we are prompted to redo HDR calibration every single time the device turns back on or is reconnected, even though the display information should be exactly the same. Additionally, though this is less a bug and more a missing feature, you can't properly control color space etc. for HDMI outputs on the PS5. Anyway, I'm not sure there's much I can do about this, but I highly doubt any of these companies care enough to help, so at least I get to rant. If anyone has found a solution, I'd love to hear it.
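All of the capability weirdness above boils down to what bytes the sink advertises in its EDID. As a rough illustration (the byte layout is from the VESA EDID standard, but the sample bytes below are hypothetical, not dumped from the A4A or the TV), here's a minimal Python sketch decoding the manufacturer PnP ID that every EDID carries:

```python
# Minimal sketch: decode the manufacturer PnP ID from an EDID block.
# Bytes 0-7 are the fixed header; bytes 8-9 pack three 5-bit letters
# (1 = 'A') big-endian. Sample bytes are hypothetical, not a real dump.

def decode_manufacturer(edid: bytes) -> str:
    """Return the three-letter PnP vendor code from EDID bytes 8-9."""
    word = (edid[8] << 8) | edid[9]
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") + code - 1) for code in letters)

# Hypothetical fragment: fixed header, then 0x4C2D ("SAM", the PnP ID
# registered to Samsung).
sample = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x4C, 0x2D])
print(decode_manufacturer(sample))  # SAM
```

On a Linux box you can read the raw blob a display chain actually presents from /sys/class/drm/*/edid, which is a handy way to see what a receiver in the middle is (or isn't) passing through.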
  4. Looks great--thanks! Time to suck up a little more ram!
  5. I've tried that--there's no good place. There's no like 3rd party software I can find? I might try that...it's kinda crappy tho so it might bug out
  6. I have an awesome ASUS ROG SWIFT P248Q 1440p 21:9 curved monitor as my main monitor, but I have an old ViewSonic 1080p monitor that I use for Spotify and chat while playing games, extra room, etc. This works great, except that while the physical heights are the same, the mouse movement and content movement from screen to screen differs--is there any way to stretch my desktop where that doesn't happen without lowering my main monitor res to 1080p?
  7. In the latest TechLinked episode, Riley was explaining the details of the awesome new HoloLens 2. However, he did err in that he said the OG was $5000, more than the 2’s $3500. A HoloLens enthusiast myself, I’m familiar with the pricing and know this is incorrect. The Commercial Suite of the OG (which came with a number of features like Kiosk Mode) was $5000, but the standard edition (Developer Edition) was the same model and was only $3000, making it cheaper than the new one. Hopefully the 2 will have the commercial features built in, or that commercial suite might be a bit more...unaffordable. Not that the standard is cheap ofc.
  8. What games? A lot of games, like PUBG for example, have rendering engines that just don't work well. Also, are you sure you're on the highest settings for shadows, foliage, textures, occlusion, etc.? If you have a Nvidia card, pls send the GeForce Experience page for it, otherwise ig just send the settings page in the game.
  9. Ik this topic is probably closed but just fyi, here's what my motherboard manual has to say: Your MB may vary--there's probably a similar page in your manual. Basically PCIe SSDs (like the one you have) should only interfere with PCIe slots--SATA should be fine. It runs over a different interface and takes up different bandwidth (PCIe lanes instead of SATA). I have a crappy MB--mine trashes an x8 slot if I use an M.2 drive (ig because each uses x4, but that's still annoying for just using one). You should be fine with whatever slot as long as your MB manual denotes that that slot is compatible with PCIe SSDs (because SATA ones use a slightly different key which is physically different). Sorry for the confusion, but basically from what I can tell you'll be fine with that top slot as mentioned above.
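Those manual lane-sharing tables can be annoying to read, so here's a tiny sketch that encodes them as a lookup. The slot/port names and the rule set are entirely made up for illustration -- every board differs, so copy the real rules from your own manual's storage-configuration page:

```python
# Sketch: encode a motherboard manual's M.2 lane-sharing rules as a dict.
# The rules below are hypothetical examples, NOT from any real board.
SHARING_RULES = {
    ("M2_1", "pcie"): [],         # PCIe M.2 SSD: uses PCIe lanes, SATA untouched
    ("M2_1", "sata"): ["SATA5"],  # SATA M.2 SSD: disables a shared SATA port
}

def disabled_ports(slot: str, ssd_type: str) -> list:
    """Return the SATA ports lost when `slot` holds an SSD of `ssd_type`."""
    return SHARING_RULES.get((slot, ssd_type), [])

print(disabled_ports("M2_1", "sata"))  # ['SATA5']
```

The point of the example is just that the conflict depends on the SSD's interface (PCIe vs. SATA key), not merely on which physical slot it sits in.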
  10. M.2 slots can either draw from SATA or from PCIe--basically acting as either a SATA port or a PCIe 4x slot. M.2 SATA SSDs, like https://www.amazon.com/Blue-NAND-500GB-SSD-WDS500G2B0B/dp/B073SBX6TY/ref=sr_1_3?ie=UTF8&qid=1550455975&sr=8-3&keywords=SATA+m.2, use the SATA interface, while others, like 970 Evo mentioned by @Pm_me_nude_pc_parts, use PCIe interface, and therefore PCIe bandwidth INSTEAD of SATA bandwidth. Some slots use either one or the other (like he has), while some (like mine) can be used as either depending on the SSD being used. PCIe has a much higher bandwidth and maximum speed than SATA, so I recommend using the PCIe port instead. Unless, however, you have an SLI config or something that requires a lot of PCIe bandwidth, maybe SATA is a better option--sacrifice a small bit of storage performance for more graphical processing power or network speed or whatever your case may be. My suggestion is use the PCIe one so it doesn't bottleneck any more than it has to (other parts, OS, etc.).
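To put rough numbers on the SATA-vs-PCIe gap described above, here's a quick calculator for theoretical link throughput. It accounts only for line-encoding overhead (8b/10b for SATA III, 128b/130b for PCIe 3.0); real drives land below these ceilings:

```python
# Theoretical link throughput in MB/s, encoding overhead only.

def sata3_mbps() -> float:
    # SATA III: 6 Gb/s line rate, 8b/10b encoding, 8 bits per byte.
    return 6_000 * (8 / 10) / 8

def pcie3_mbps(lanes: int) -> float:
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding.
    return 8_000 * (128 / 130) / 8 * lanes

print(round(sata3_mbps()))   # 600
print(round(pcie3_mbps(4)))  # 3938
```

So a PCIe 3.0 x4 M.2 slot has roughly 6.5x the ceiling of a SATA one, which is why the PCIe option is usually worth the lane trade-off.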
  11. Idk what @Firewrath9 is saying, but that's an NVMe SSD and in order to take full advantage of the NVMe capabilities, you need to use the PCIe slot. Also, a lot of the SATA ones only work with certain keys (B vs. M), which can cause compatibility issues. My MB (MSI GAMING Z270 M3) has two slots, both capable of PCIe and SATA interfaces, but which of those you use can affect which SATA ports and PCIe slots you can use for other devices like SATA HDDs and graphics cards. To be clear, the speeds are comparable, and sometimes the OS or what you have running in the background limits the speed more, so it doesn't really matter.
  12. Interestingly enough, the keyboard problem stopped, at least for now... if it happens again I'll RMA it. It's a hassle, but I'd rather have a quality product than a broken one ofc
  13. I've recently purchased the ROG P348Q Monitor and the ROG Strix Flare Keyboard (Like $900 and $150 btw), and I have to say, I'm disappointed. About two months after using them for the first time, the monitor has a dead pixel and the keyboard won't consistently hold keystrokes (i.e. trying to hold SHIFT to sprint in HL2 or holding E to revive my friend in Apex Legends). As a more "premium" brand, with prices to match, I'm kinda mad that this happened. I was initially very happy with both--the monitor is awesome, with one of the best LCD panels I've ever seen, and the keyboard made me love typing again, even more than when I got my Surface Laptop. Support says there have to be 3-5 dead pixels to RMA the monitor, and I really don't want to have to send my keyboard back, especially when I'm sure the same thing will happen again. What was really interesting was the timing--I got the monitor a few days before the keyboard, and it got the dead pixel a few days before the keyboard started not working right. Anyone else have this issue/any solutions besides RMAing them or living with it? I've tried a stuck-pixel remover, and I'm gonna run another all night tonight, but idk about the keyboard.
  14. Eh nvm I ended up just buying the B+. My power supply had been having problems but I continued using it...that probably fried the components. I ended up buying a new power supply as well.
  15. lspci -nnv returned that that command was not found. lsusb returned:
Bus 001 Devices 001-004
Microsoft Corp. Wireless Optical Desktop 3.0
Standard Microsystems Corp. SMSC9512/9514 Fast Ethernet Adapter
Standard Microsystems Corp. SMC9514 Hub
Linux Foundation 2.0 root hub