Everything posted by Glenwing

  1. Check what ports your GPU has, check the pinned thread to see your options. https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/
  2. An adapter that says "active adapter" in the product title or description.
  3. Active adapters will work.
  4. Hi, please start a new thread; do not hijack someone else's thread for your question.
  5. It's just my own observation based on experience, not something I read on the internet. Generally, retail product listings (and even datasheets and manuals) can't be relied upon just because they label something as "Dual-Link DVI", because there is a widespread belief that you can distinguish single-link from dual-link just by looking at the connector, which is not true. It seems there are some EVGA GT 710s that have two DVI ports and EVGA distinguishes between single-link and dual-link, so based on that, I trust that dual-link is indeed available on the GT 710. https://www.evga.com/Products/Specs/GPU.aspx?pn=F345C5EC-D00D-4818-9B43-79885E8A161F It says the maximum for DVI is 2560×1600, not 1920×1200. The limit of 1920×1200 at 60 Hz is only for single-link mode. (There's a quick pixel-clock sketch at the end of this list showing why that's the single-link ceiling.)
  6. DVI ports on low-end cards are often single-link only (GT 1030 comes to mind). There's nothing in there about a limit of 60 Hz. If you mean where it says "Maximum resolution: 2560×1600 at 60 Hz", that just means that's the maximum resolution at that refresh rate. If you use a different refresh rate like 144 Hz then the maximum resolution will be different.
  7. It seems to be a reasonably common occurrence: https://linustechtips.com/topic/1509589-why-am-i-loosing-my-4th-monitor-to-dsc/ https://forums.tomshardware.com/threads/triple-samsung-ls28bg702epxen-monitors-are-stuck-on-120hz.3819333/ https://forums.tomshardware.com/threads/cant-connect-4-monitors-to-a-3090ti-gpu.3814708/#post-23056965
  8. https://linustechtips.com/topic/1046820-why-is-my-display-not-75hz/
  9. NVIDIA does not support the FreeSync on HDMI protocol.
  10. Yes, it's incorrect. First of all, "Max resolution" doesn't actually exist as a restriction. GPUs have no restrictions on the maximum resolution they will support except for those inherent to the protocol, such as a hard limit of 4095 in each dimension for old cards/drivers that only support legacy EDID, or things like that. This used to be easier to prove when "max resolution" specs were lower; for example, Fermi cards like the GTX 580 listed "max resolution 2560×1600", but you can run 3840×2160 on them just fine (at 30 Hz due to bandwidth limitations of the connections they support, but hey). On Kepler the "maximum resolution" was 4096×2160, but people ran 5K monitors on it just fine, and indeed, multiple 4K monitors (you see, the "max resolution" is not reached by adding up all your displays). The "maximum resolution" spec was invented to give consumers confidence that their display would be supported. Back when 2560×1600 was the highest available resolution in a monitor, they liked to put "Maximum resolution: 2560×1600". But it's not meant as "it can't go beyond 2560×1600"; it's more like "yes, it supports even those 2560×1600 monitors". These days, they generally just pick the highest standard 16:9 resolution that can be supported at 60 Hz and list that as the "maximum resolution", but you can go higher if you lower the refresh rate, enable chroma subsampling or DSC, etc., and it certainly doesn't mean that multiple displays are "added up" and checked against this number. With modern cards having a "maximum resolution" of 7680×4320 it's a bit more difficult to demonstrate from home than it used to be, but you can take it from NVIDIA instead. This was for the RTX 30 series: https://www.nvidia.com/content/dam/en-zz/Solutions/geforce/ampere/pdf/NVIDIA-ampere-GA102-GPU-Architecture-Whitepaper-V1.pdf As you can see, there is no "all your displays must add up to less than 7680×4320" limitation. You can run two 7680×4320 displays, one per port. The limit of two comes from the fact that DSC is required for 8K 60 Hz (there's a quick bandwidth calculation at the end of this list showing why). https://nvidia.custhelp.com/app/answers/detail/a_id/5338/~/using-high-resolution%2Frefresh-rate-displays-with-vesa-display-stream
  11. Microsoft PowerToys can also do this (see the "Mouse Without Borders" section) https://learn.microsoft.com/en-us/windows/powertoys/
  12. This is incorrect; the "max resolution" spec on a graphics card is not added up between ports, every port is independent. The issue is that GeForce cards only support 4 displays, and displays using DSC count as 2. The G9 at full resolution/refresh rate would require DSC.
  13. It does make sense in an enterprise/datacenter context, which is why large datacenters already use this kind of architecture. It's just in the home that the gains are small and the inconvenience is large, so it's not really worth it to most consumers.
  14. 32-bit is the Windows desktop color depth; the value is always 32 and does not affect transmission to the display.
  15. No, it's not a 120 Hz monitor.
  16. Windows doesn't show size, it only shows resolution.
  17. Does it have a model number on the tag? If it's an XL2411, this model is limited to 60 Hz over HDMI.
  18. It's sorted out, some people just don't get it, that's all. If you want to be specific, say 3840×2160. If you want to use a shorthand, call it 4K UHD. The convention of referring to resolutions by a shorthand like "2160p" covers any resolution where the second number is 2160. So it could refer to 3840×2160 or 5120×2160 or anything else. If you say "2160p" by itself, people will usually assume you mean 2160p with a 16:9 aspect ratio, which is 3840×2160. If you want to specify a different one, include an aspect ratio, like "2160p 21:9", which would be 5120×2160. The convention of referring to resolutions by a shorthand like "4K" covers any resolution where the first number is around 4000. So it could refer to 3840×2160 or 4096×2160 or 3840×1600 or 4096×3072 or any number of other resolutions. It works the same as above, except the number is now rounded to make a shorter shorthand. Don't know why this confuses people so much, but it does. Again, if you just say "4K" with no other context, it's usually assumed that you mean 4K width in a 16:9 aspect ratio, so 3840×2160. If you want to refer to a different one, include an aspect ratio, like "4K 21:9". Or just write out the resolution, 3840×2160, and there is no possibility for confusion. Ah. Well, the OP's original post was asking how the "K" naming convention works, with no specific context, and your explanation began with how the DCI standard defined 2K and 4K, from which the practice of using 2K and 4K as shorthands for 1920×1080 and 3840×2160 apparently derives. The implication, to me, was that you believed DCI was the original source of these names, and that the DCI definitions of "2K" and "4K" were the "true meaning of 2K/4K", as it were, which is a common misconception. Hopefully that clears it up.
  19. Looks like part of an input filter from the battery. It's connected to the battery positive (through PF201) and the metal piece is likely connected to ground, so when you touch it to the inductor you are basically putting a direct short across the battery terminals. Since there are two others in parallel, you may be able to just remove the inductor and it might still work fine; the filter value will just change a bit. You should also check the fuse (PF201), as it might have blown.
  20. If you set a custom resolution you need to set the timing standard to CVT Reduced Blank, don't leave it on automatic.
  21. 1080p as in 1920 × 1080? If you're using 2560 × 1080 then 100 Hz will be about the limit for a 340 MHz HDMI output like on the 750 Ti. (There's a quick calculation at the end of this list showing the numbers.)
  22. This itself also hints at some minor misconceptions. 4K and 2K are generic terms. They are adjectives, not names. DCI did not invent the terms 2K and 4K, and the use of these terms long predates the DCI standards. As such, the idea that the DCI standard defines what the terms 4K and 2K mean is not correct. 4096 × 2160 is an example of a 4K resolution that has been standardized by DCI. Another standardized 4K resolution is 3840 × 2160, in ITU-R BT.2020. Again, 4K and 8K are adjectives. Which UHDTV system? The 4K one. The 8K one. Let's put it this way. I'm defining a new standard right now. My standard defines two video formats: a 16:9 format and a 21:9 format. The 16:9 format has a resolution of 1600 × 900 and the 21:9 format has a resolution of 2100 × 900. Then, there will surely be some people on the internet who read that sentence and then say "Look, 1920 × 1080 isn't really 16:9! See, true 16:9 is defined as 1600 × 900! See, this standard right here establishes the official definition of The 16:9 Video Format!", failing to realize that "16:9" isn't being used as a name here, just a description. In the same way, the DCI standard establishes terms and definitions to use as shorthands within the scope of the document, the way that legal documents work. But it's not intended to establish "2K" and "4K" as exclusive names for the formats it defines. They're just descriptions used within the document. Like I said, the use of these terms is generic and long predates DCI. A 4K scan of 35 mm film will be around 4096 × 3112 or something like that. A 4K cinema crop will be like 4096 × 1728 or whatever. 4096 × 2160 is one particular 4K resolution that some people have standardized around for certain purposes. 3840 × 2160 is another 4K resolution that people have standardized around for some other purposes.
  23. The selection of the resolution isn't the coincidence, I'm just talking about the name having "4" in it while being 4 times as many pixels as 1080p. That's not where the "4" in "4K" comes from, it's just a coincidence.
  24. Really it all just stems from people noticing that 4K has four times as many pixels as 1080p, saying "ohhh I see the pattern!" (which is totally how patterns work, right, you can definitely identify patterns by looking at a sample of 1) and then saying "well then if that's the case, 1080p would be 1K and 1440p would be 2K since it's 2x as many pixels as 1080p, that makes sense, and if something makes sense, that proves that it's true, so I should definitely not try to check at all, and instead start educating other people on the internet about my un-checked assumption about how the system works, but in an expert tone without disclosing that it's an assumption I haven't checked at all!". And unfortunately when it comes to terminology, once you get it up and running you can get this sort of perpetual motion machine going where everyone just says "well I just call it that because everyone else calls it that". The really sad thing is that the 4K and 8K UHD resolutions were both announced at the same time, so anyone could have just done a basic confirmation: "well, if my assumption about where the name "4K" comes from is correct, then 8K should be 8x 1080p, let's check if that's true or not" and then immediately seen that the "pattern" doesn't hold and that the whole "4K is 4x 1080p" thing is just a coincidence (the quick check is at the end of this list).
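
A rough sketch (Python) of the pixel-clock math behind the single-link/dual-link DVI posts above (items 5 and 6). The CVT-RB blanking here is only an approximation of the VESA formula (fixed 160-pixel horizontal blank, roughly 460 µs minimum vertical blank), and the 165/330 MHz figures are the usual single-link and dual-link DVI TMDS clock ceilings; treat the exact MHz results as estimates.

# Rough CVT-RB pixel clock estimate; an approximation, not the full VESA CVT-RB formula.
def cvt_rb_pixel_clock_mhz(h_active, v_active, refresh_hz):
    h_total = h_active + 160                  # CVT-RB uses a fixed 160-pixel horizontal blank
    frame_period = 1.0 / refresh_hz
    v_blank = 8
    for _ in range(10):                       # iterate: blank line count depends on line time
        v_total = v_active + v_blank
        line_time = frame_period / v_total
        v_blank = max(8, int(460e-6 / line_time) + 1)   # ~460 us minimum vertical blank
    return h_total * (v_active + v_blank) * refresh_hz / 1e6

SINGLE_LINK_MHZ = 165.0                       # single-link DVI TMDS clock limit
DUAL_LINK_MHZ = 330.0                         # dual-link doubles the data pairs

for name, mode in {"1920x1200 @ 60 Hz": (1920, 1200, 60),
                   "2560x1600 @ 60 Hz": (2560, 1600, 60)}.items():
    clk = cvt_rb_pixel_clock_mhz(*mode)
    verdict = ("fits single-link" if clk <= SINGLE_LINK_MHZ
               else "needs dual-link" if clk <= DUAL_LINK_MHZ
               else "exceeds dual-link")
    print(f"{name}: ~{clk:.0f} MHz -> {verdict}")

This lands around 154 MHz for 1920×1200 at 60 Hz (under the single-link limit) and around 269 MHz for 2560×1600 at 60 Hz (dual-link territory), which is why those two resolutions show up as the single-link and dual-link ceilings at 60 Hz.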
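
A back-of-the-envelope version of the bandwidth argument in items 10 and 12: uncompressed 8K at 60 Hz doesn't fit in DisplayPort 1.4's data rate, which is why DSC is required for it. The 25.92 Gbit/s figure is four HBR3 lanes after 8b/10b encoding overhead, and the calculation below assumes 8 bpc RGB and ignores blanking, so the real requirement is a bit higher still.

# Can 7680x4320 @ 60 Hz, 8 bpc RGB, fit in DP 1.4 without DSC? (blanking ignored)
H, V, HZ, BPP = 7680, 4320, 60, 24
DP14_DATA_GBPS = 25.92                        # 4x HBR3 lanes after 8b/10b overhead

payload_gbps = H * V * HZ * BPP / 1e9
verdict = "DSC required" if payload_gbps > DP14_DATA_GBPS else "fits uncompressed"
print(f"~{payload_gbps:.1f} Gbit/s needed vs {DP14_DATA_GBPS} Gbit/s available -> {verdict}")

That works out to roughly 48 Gbit/s against about 26 Gbit/s available, so DSC is unavoidable at 8K 60 Hz over DP 1.4.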
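
The same kind of estimate for the 750 Ti post (item 21): 2560×1080 with reduced blanking sits under a 340 MHz TMDS clock at 100 Hz and goes over it at 120 Hz. The blanking totals below are approximate CVT-RB values, so take the exact numbers as ballpark figures.

# Approximate CVT-RB totals for 2560x1080 vs a 340 MHz HDMI (1.4-class) TMDS clock
TMDS_LIMIT_MHZ = 340.0
modes = {"2560x1080 @ 100 Hz": (2720, 1133, 100),    # h_total, v_total (approximate), refresh
         "2560x1080 @ 120 Hz": (2720, 1144, 120)}
for name, (h_total, v_total, hz) in modes.items():
    clk = h_total * v_total * hz / 1e6
    print(f"{name}: ~{clk:.0f} MHz -> {'fits' if clk <= TMDS_LIMIT_MHZ else 'exceeds'} 340 MHz")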
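
And the sanity check items 23 and 24 are pointing at: the "4K = 4x the pixels of 1080p" pattern breaks as soon as you test it against 8K.

fhd    = 1920 * 1080
uhd_4k = 3840 * 2160
uhd_8k = 7680 * 4320
print(uhd_4k / fhd)   # 4.0  -> looks like "4K means 4x the pixels of 1080p"
print(uhd_8k / fhd)   # 16.0 -> but 8K is 16x 1080p, not 8x, so the "pattern" was a coincidence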