
Glenwing

Senior Moderator
  • Posts

    17,122

Posts posted by Glenwing

  1. 2 hours ago, MrSimplicity said:

    No, it better be an LED monitor. If it isn't then that's some really bad false advertising. 

    The product page makes it pretty clear it's an LED-backlit LCD (IPS-type).

     

    LED-backlit monitors have been commonly labeled as "LED monitors" for over a decade.

  2. 1 hour ago, Eigenvektor said:

    From what I'm able to find the cable should be able to do 32.4 Gbps (which would be HBR3).

     

    1440p, 10 bpc, 240 Hz requires 30.77 Gbps, so the cable should be up for it. Maybe it was defective or the specs are plain wrong, but yeah you'll need a cable that supports HBR3 (cables are not versioned, they are rated by the maximum bandwidth they support, even if the version is often used for marketing purposes)

    HBR3's maximum data rate is 25.92 Gbit/s, which corresponds to a physical bit rate of 32.4 Gbit/s.

     

    The requirement for 2560×1440 at 240 Hz and 10 bpc is a 30.77 Gbit/s data rate, which becomes a 38.46 Gbit/s bit rate in HBR3 transmission. So it would be outside the limit.

     

    But you are comparing the data rate requirement of one against the bit rate limit of the other, which is incorrect 🙂

     

    If the monitor only supports HBR3, then it either uses DSC or does not allow 240 Hz and 10 bpc simultaneously. In either case, only an HBR3-rated cable is needed anyway.
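
    To make the like-for-like comparison concrete, here is a quick sketch (Python) using the figures above; the 8/10 factor is the 8b/10b encoding used at HBR3 rates:

    HBR3_BIT_RATE = 32.40e9                  # physical rate: 4 lanes × 8.1 Gbit/s
    HBR3_DATA_RATE = HBR3_BIT_RATE * 8 / 10  # 25.92 Gbit/s of actual data

    required_data_rate = 30.77e9             # 2560×1440, 240 Hz, 10 bpc (from above)
    required_bit_rate = required_data_rate * 10 / 8  # 38.46 Gbit/s on the wire

    print(required_data_rate <= HBR3_DATA_RATE)  # False → exceeds HBR3
    print(required_bit_rate <= HBR3_BIT_RATE)    # False → same verdict either way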

  3. 9 hours ago, Shimejii said:

    Are you sure about that? HDMI 2.0 is limited to 144 Hz. If that laptop has HDMI 2.1 then it's technically possible, but even then they tend to be limited to just 144 Hz. Are you sure they aren't using a USB-C port?

    There are no specific refresh rate limitations on any interface.

     

    HDMI 2.0 is limited to around 144 Hz at 1440p due to its maximum bit rate, if 8 bpc RGB color is used. If the monitor is 1080p, or any of those other parameters change, the maximum refresh frequency will change.
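
    As a rough illustration (a sketch: HDMI 2.0 carries 18 Gbit/s, which is 14.4 Gbit/s of data after 8b/10b encoding; the blanking totals below are approximate CVT-RB figures, my assumption):

    max_data_rate = 14.4e9          # HDMI 2.0: 18 Gbit/s bit rate × 8/10
    bits_per_pixel = 3 * 8          # RGB at 8 bpc
    h_total, v_total = 2720, 1525   # 2560×1440 plus CVT-RB blanking (approximate)

    max_refresh = max_data_rate / bits_per_pixel / (h_total * v_total)
    print(f"{max_refresh:.1f} Hz")  # ≈144.6 Hz, hence the common 144 Hz ceiling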

  4. DVD Video is 720×480 at 29.970 fps for NTSC format or 720×576 at 25 fps for PAL format. NTSC TVs run at 59.94 Hz, and DVD players transmit each frame twice.

     

    For sources originating from 23.976 fps film, 3:2 pulldown is used to convert it to 29.970 fps for DVD Video. So this is already built into the images on the DVD.
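
    To illustrate the cadence (a toy sketch; the film-frame labels are just for illustration):

    # 3:2 pulldown: every four film frames become ten interlaced fields,
    # i.e. five video frames, turning 23.976 fps into 29.970 fps.
    film = ["A", "B", "C", "D"]
    cadence = [3, 2, 3, 2]          # fields emitted per film frame
    fields = [f for frame, n in zip(film, cadence) for f in [frame] * n]

    video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
    print(video_frames)  # [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]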

     

    Almost any monitor will already have a CTA-861 profile that runs at 59.94 Hz. Using CVT will not allow you to get such an exact rate, due to the requirement that the pixel rate be an even multiple of 250 kpx/s.

     

    It may be more worthwhile to just make a custom resolution at 1920×960 (that is, 480×2 lines) and then display it with aspect ratio scaling. You'll get thin black bars on the top and bottom, but the 480-line image will be scaled 2:1 precisely (if you set scaling to maintain aspect ratio). And it can be based on the standard CTA-861 modeline.

     

    $ xrandr --newmode "1920x960x60"  148.3515  1920 2008 2052 2200  960 964 969 1125 +hsync +vsync
    $ xrandr --addmode <output> "1920x960x60"   # <output>: your display's name from plain "xrandr", e.g. DP-1
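
    As a quick check (Python) that this modeline lands on 59.94 Hz:

    pixel_clock = 148.3515e6        # the clock from the modeline above
    h_total, v_total = 2200, 1125   # totals borrowed from the CTA-861 1080p timing
    print(pixel_clock / (h_total * v_total))  # 59.94 Hz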

     

  5. On 3/29/2024 at 12:23 PM, dinkostinko said:

    GPU Scaling has been turned off the entire time, yet text is still visibly blurrier on Display 2 than it is on Display 1.

     

    I unfortunately have to use HDMI as the USB-C port does not have the bandwidth to support two 1440p monitors.

     

    Turning on GPU Scaling defaults the resolution to 4K, and limits me to 30 Hz. How do I make it so I am outputting a 1440p signal over HDMI just like I am over DisplayPort?

    Sounds like AMD is not implementing something correctly in their driver.

     

    Download the Custom Resolution Utility here: https://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

     

    Select the 1440p monitor at the top, then select the CTA-861 extension block and click "Edit":

    2024-03-30 #763 (498×483).png

     

    Select the TV Resolutions data block:

    2024-03-30 #765 (348×431).png

     

    Delete all the 4K entries:

    2024-03-30 #767 (378×337).png

     

    Click OK on everything, then restart the graphics driver using Restart64.exe

    2024-03-30 #768 (135×105).png

     

    Then the 4K profile should be gone, and it should use 1440p.

  6. Some 1440p monitors accept 4K resolution input signals on HDMI. This is for compatibility with older consoles (PS4, Xbox One) which only let you choose between 1080p and 4K. So on 1440p monitors they would be forced to a 1080p limit. To avoid this, monitors accept a 4K signal and then downscale internally to 1440p.

     

    If you have GPU Scaling enabled in the AMD Adrenalin settings, the graphics card will always output a signal at the monitor's maximum resolution, regardless of what resolution you set the desktop to. If you choose a lower resolution, it will upscale it to the monitor's maximum resolution before transmitting. Unfortunately, this interacts badly with 1440p monitors that accept a 4K signal: the driver takes your 1440p image, upscales it to 4K, and transmits it, and the monitor then downscales it back to 1440p on the other side. The double conversion costs image quality, and the increased resolution during transmission also limits you to 60 Hz.
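
    To put a number on the 60 Hz limit, here is a sketch assuming an HDMI 2.0 link (14.4 Gbit/s data rate after 8b/10b) and approximate CVT-RB blanking totals, both assumptions on my part:

    link_data_rate = 14.4e9
    bits_per_pixel = 3 * 8   # 8 bpc RGB

    def required(h_total, v_total, hz):
        return h_total * v_total * hz * bits_per_pixel

    print(required(4000, 2222, 60) <= link_data_rate)   # True  → 4K 60 Hz fits
    print(required(4000, 2222, 120) <= link_data_rate)  # False → 4K 120 Hz does not
    print(required(2720, 1525, 120) <= link_data_rate)  # True  → native 1440p 120 Hz would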

     

    To avoid this, turn off GPU Scaling in the graphics card control panel, or don't use HDMI.

  7. 4 minutes ago, MadManMoe said:

    Oh, man. Totally missed that part. I got it a few years ago, so that's why it worked so well until I got the G9 now. Great catch, that makes perfect sense.

    If you have any recommended KVMs that can support this monster, I would appreciate it. Either way, many thanks!

    The L1Techs model may work, as it supports Display Stream Compression, which is what's needed for 5K 240 Hz. I haven't tested it personally, though.

     

    https://www.store.level1techs.com/products/p/14-kvm-switch-single-monitor-2computer-64pfg-7l6da

  8. 1 hour ago, MadManMoe said:

    I hate long posts, so I'll try and keep this short (prefer replies from those who have experience with the G9).

     

    Current Setup:

    • (For Work):
      • Model: Lenovo ThinkPad T14
      • CPU: AMD Ryzen 7 Pro 5850U
      • GPU: Dedicated Radeon Graphics
      • OS: Windows 11 Pro
    • (For Home):
    • CKL 2 Port KVM switch 
    • Samsung Odyssey G9 OLED Monitor (Model: LS49CG954SNXZA)

    Previously, I had two Alienware 27-inch 1080p monitors that worked without a problem... until I replaced them with the G9.

     

    The TL;DR of it all is I spent hours mixing... altering... moving cables and setups and wires around, until I went flat out cave-man and connected my work laptop using HDMI and my home PC with the included DP cables... but now... my KVM is basically an overpriced KM instead of a KVM switch.

     

    My questions are:

    1. Is the G9 so finicky and so sensitive that it just refuses to cooperate with most KVMs (not many resources out there, but it seems to be an issue)?
    2. My home machine is running Pop!_OS. With a direct connection using the included 8K DP cable, I get at best 120 Hz at full resolution. At 240 Hz the screen goes mad and flickers on and off every 2 seconds. Is this an OS issue, a monitor issue, or a PC issue? (The laptop runs fine at 120 Hz using HDMI, which is perfectly acceptable for me.)
    3. Rumor has it the G9 has a built-in USB switch/hub. Nothing in the included documentation says anything about it. Does anyone know anything about it?
    4. Finally: do I just need to rip out all my wires and cables and replace each one with an 8K-certified cable? Is that really my issue?

    Thanks in advance, fellow tech guys.

    I should note on the product page of the KVM it says:

    Quote

    Special Reminder on 5K resolution:  This KVM switch does not work for 5K (5120x1440 or 5120x2880) over 100Hz, recommended is 50/60Hz. Please be aware of this before order.

     

  9. 4 minutes ago, emosun said:

    Ah OK. Does the monitor have no way to switch between those inputs?

    That's what he's doing right now. But he'd like the button on his KVM to switch K, V, and M. Not just K and M and then switch the video separately.

    1 hour ago, MadManMoe said:

    until I went flat out cave-man and connected my work laptop using HDMI and my home PC with the included DP cables...but now...my KVM is basically an overpriced KM instead of a KVM switch.

     

  10. 5 hours ago, 420WeedJesus said:

    Do you think something like a DisplayPort repeater will fix this issue, so I don't have to get a whole new setup for it? After doing some research I just found out about these and it seems like it could work.

    Something like this: https://www.amazon.com/gp/product/B07D949F59/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1

    That could probably work. Since it uses an external power supply, it shouldn't rely on the 3.3 V from the port.

  11. 12 minutes ago, omultphy said:
    I am buying an IdeaPad Slim 5 from Lenovo that has HDMI 1.4b and two USB 3.2 Gen 1 (full function) ports. Can I get 4K 60 Hz output from this if I were to connect a 4K monitor? What about 2K and higher refresh rates such as 120 Hz?
     
    thanks 
     

    It supports whatever it supports; the video capabilities are not related to the USB generation. It's a direct output from the GPU, so it will just be based on the DisplayPort capabilities of the GPU.

     

    https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?output=USBC

     

    Based on what I can find about the AMD 7730U, it should support the formats you mentioned.

     

    https://www.amd.com/en/products/apu/amd-ryzen-7-7730u

     

    4K 60 Hz and 1440p 120 Hz only require HBR2 speed, which DisplayPort has supported for over a decade.
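
    A quick sketch confirming that (the blanking totals are approximate CVT-RB figures, my assumption):

    HBR2_DATA_RATE = 17.28e9  # 21.6 Gbit/s bit rate × 8/10 for 8b/10b encoding

    def data_rate(h_total, v_total, hz, bpc=8):
        return h_total * v_total * hz * 3 * bpc

    print(data_rate(4000, 2222, 60) / 1e9)   # ≈12.8 Gbit/s for 4K 60 Hz
    print(data_rate(2720, 1525, 120) / 1e9)  # ≈12.0 Gbit/s for 1440p 120 Hz
    # Both are below 17.28 Gbit/s, so HBR2 is sufficient.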

  12. Adapters don't have specific support for "MST"; that doesn't exist. But adapter support on MST outputs will depend on the device. Some work with both passive and active adapters, some only work with active adapters, and some don't work with any adapters. If you've tried passive adapters already then you should try an active adapter.

  13. 3 hours ago, Thaldor said:

    The couple of monitors I have seen with FRC have been like that. I must say I haven't dug deep enough there, since I decided to go directly for native 10-bit support, so I didn't need to look deeper. But I would guess it's that the panel itself can do max 170 Hz, so the FRC must also fit into that, which means the "refresh rate" of 10-bit content must be slower than 170 Hz to make time to fit that flickering between two colors within the refresh rate of the panel.

    It's much easier to explain what I mean with 3D TVs and glasses. 3D content requires that, per frame of content, the TV must show one frame per eye to create the 3D effect. This means if the TV has a panel with a 120 Hz refresh rate (120 frames per second), it can only do 60 "fps" 3D content; it is still doing 120 Hz/fps, but the content must run at half of that because the frames are divided between eyes.

     

    Same with FRC: the FRC must flicker two 8-bit colors to recreate a 10-bit color, and this eats into the rate at which the monitor can show content, because it needs to show more frames per frame of content to recreate the 10-bit colors. The system doesn't know the monitor uses FRC, so the monitor must slow down the rate at which the system sends content so that it has time to show the content with FRC, so it just tells the system the refresh rate is lower.

    It is because a DisplayPort HBR2 transmission is limited such that 170 Hz and 10 bpc are not possible at the same time at 1440p.

     

    It is a straightforward bandwidth limitation and is not related to how 10 bpc color depth is implemented (FRC or not). Since the monitor advertises "DP 1.4" hopefully this means it supports HBR3 speed, so this would not apply. That is what the discussion above was about.
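
    To make the limit concrete, a sketch (the blanking totals are approximate CVT-RB figures, my assumption):

    HBR2_DATA_RATE = 17.28e9                 # 21.6 Gbit/s bit rate minus 8b/10b overhead
    h_total, v_total, hz = 2720, 1525, 170   # 2560×1440 plus approximate CVT-RB blanking

    for bpc in (8, 10):
        rate = h_total * v_total * hz * 3 * bpc
        verdict = "fits" if rate <= HBR2_DATA_RATE else "exceeds HBR2"
        print(f"{bpc} bpc: {rate / 1e9:.2f} Gbit/s, {verdict}")
    # 8 bpc: ≈16.92 Gbit/s (fits); 10 bpc: ≈21.15 Gbit/s (exceeds)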

     

    Panels are addressed in a matrix format where only one row of pixels is actually connected to the controller at any given time. The controller scans through every row in order over the course of one refresh cycle. It is not possible to go back and change the color of a pixel twice during one refresh cycle, as this would simply be the same as having a twice-as-high refresh rate, and that capability would not be wasted on simply implementing FRC.

     

    FRC operates by changing the color on alternating refresh cycles, not within a single cycle.
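
    As a toy illustration of that idea (not any vendor's actual algorithm): a 10 bpc level is approximated by alternating two adjacent 8 bpc levels on successive refresh cycles, so that the time-average matches.

    def frc_level(value_10bit: int, frame: int) -> int:
        base, remainder = divmod(value_10bit, 4)  # four 10-bit steps per 8-bit step
        # Show the next 8-bit level up on `remainder` out of every 4 refresh cycles
        return base + (1 if frame % 4 < remainder else 0)

    target = 514  # a 10-bit level between 8-bit levels 128 and 129
    shown = [frc_level(target, f) for f in range(8)]
    print(shown)                        # [129, 129, 128, 128, 129, 129, 128, 128]
    print(sum(shown) / len(shown) * 4)  # time-average maps back to 514.0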

     

    My suspicion is that the monitor only supports HBR2 speed, and when ViewSonic says it has "DP 1.4" they just mean it supports HDR or something else. I should note the user manual states only "DP 1.2".

     

    image.png

  14. 7 hours ago, Thaldor said:

    Because your monitor only supports 8 bit + FRC, so it's either "10 bit" at 120 Hz OR 8 bit at 170 Hz.

    Directly from ViewSonic:

    image.png.524924f2f5874430232c10d20770e37a.png

     

    There isn't any way around that, because the monitor itself is acting as an HDR monitor, taking that 120 Hz 10-bit signal and then changing it to 8 bit + FRC at that same 120 Hz, which is probably close to normal 8 bit at 170 Hz, because FRC is basically just flickering two 8-bit colors to mimic a 10-bit color.

    That makes no sense; the use of FRC is not related to refresh rate, nor does the system know whether the monitor uses FRC or not. It just sees a 10 bpc display.

  15. On 2/23/2024 at 7:33 AM, Gege-Brazorf said:

    Since it won't work I feel kinda bad. Thank you for trying to help me.

    Is there any way to connect two monitors except through MST?

    Yes, you can plug each one into a separate DisplayPort output on the graphics card.
