
tam082

Member
  • Posts: 3
  • Joined
  • Last visited

Awards

This user doesn't have any awards

tam082's Achievements

  1. If anyone is interested in some actual info relating to overclocking a monitor & bandwidth requirements, this article on TFT Central is excellent: http://www.tftcentral.co.uk/articles/overclocked_refresh.htm Now that the GTX 1080/1070 are being released with DisplayPort 1.3 & 1.4 support, hopefully we'll see the release of some new monitors that can take advantage of it (there's a rough bandwidth calculation sketched after this list).
  2. The technology exists today for DisplayPort / Thunderbolt 3 via USB-C. The spec sheet (from 2015) for SuperMHL looks amazing: 8K video signals at 120 frames per second, as well as power delivery (see the Dell UP3017Q). Can't wait until this is mainstream & makes overpriced HDMI cables seem even more ridiculous than they already do. What's more, I'm sure those cables will cost more like $20. If the demand is there, I guess someone will introduce a "premium" line for those that like to spend extra. Actually, just thinking about it - maybe I'm not looking in the right (or wrong) places, but I can't say I've ever come across a DisplayPort cable that costs much over $25. Huh... maybe it's an untapped market!
  3. Useless Tech Over $100 Ep. 1 could be an interesting series (effective as a PSA to stop people being robbed blind). I completely agree with the conclusion, but I'm completely confused by the use of the monitor overclock test... I came away from this video feeling like I know less than I did before watching it.

     My understanding is that overclocking headroom is mainly limited by the monitor's own controller/firmware and scaler. In the past I have experimented with overclocking different displays with mixed results: a Dell 27" IPS @ 92Hz (works over DVI, artifacts over DP despite the higher theoretical bitrate?) and a Samsung 60" LED TV @ 72Hz. My 34" LG Ultrawide (34UC87C) will not go over 60Hz, and I haven't seen any evidence of anyone else with a successful OC. Yet the same panel in the Dell U3415W will go to 75Hz. The limitation has nothing to do with the cable manufacturer or materials in those cases.

     Maybe I'm missing something, but I don't understand the methodology used in the test and how it relates to the cable specifically (genuine question). Why do some monitors OC better than others, & does frame skipping mean a lot of DIY OCs do more harm than good? (There's a rough frame-skip check sketched after this list.) There seems to be a lot of misinformation, & manufacturers (understandably) don't really support overclocking if it's not applied through their own menu (like the ROG Swift). I'm open to the possibility (happy to be shown otherwise) that the cable can make a difference in some fringe cases (I have spent a little extra on RedMere HDMI cables & VESA-certified DisplayPort cables), but I'm not sure the test in the video was necessarily one of them.

     You probably don't want to get TOO technical or make a 45-minute video, but I'd personally be a lot more interested in finding out:
     - Will your "el cheapo" HDMI cable work with HDMI 2.0a?
     - Is there any use case where spending extra yields any benefit (Ethernet over HDMI / ARC / long cable runs)?
     - Is it possible to saturate the bandwidth of *ANY* HDMI cable, regardless of cost?

     The display space seems to be moving at a faster rate than a lot of other tech. You could make a follow-up video with some more in-depth coverage (Techquickie? or not) on the technology behind the new 165/200Hz monitors, HDR and the associated bandwidth demands (4K+ / 60Hz+ / 4:4:4?). FreeSync over HDMI (first introduced independent of VESA) is pretty badass. Fingers crossed the new DisplayPort 1.3 standard will change things up once the new generation of GPUs is released. The industry needs a single universal standard like Adaptive-Sync to replace G-Sync/FreeSync long term. Looking at the length of my post, I'm probably overthinking it - I can just see so much more potential there.
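A couple of these posts lean on bandwidth math (the TFT Central link in post 1, and post 3's question about whether *any* HDMI cable can be saturated), so here is a rough back-of-envelope sketch in Python. The link capacities are the commonly quoted effective rates after 8b/10b encoding; the 20% blanking factor and the example modes are my own approximations, not figures from the posts, so treat the output as ballpark only.

```python
# Back-of-envelope: does a given video mode fit a given link?
# Capacities are the usual effective payload rates after 8b/10b
# encoding; the blanking factor is a crude estimate (CVT-RB timings
# are tighter, legacy CEA timings are looser).

LINKS_GBPS = {
    "HDMI 1.4": 8.16,   # 10.2 Gbps raw
    "HDMI 2.0": 14.40,  # 18.0 Gbps raw
    "DP 1.2":   17.28,  # 21.6 Gbps raw (HBR2, 4 lanes)
    "DP 1.3":   25.92,  # 32.4 Gbps raw (HBR3, 4 lanes)
}

def required_gbps(h, v, hz, bpp=24, blanking=1.2):
    """Approximate data rate for a mode; blanking=1.2 assumes ~20%
    of the pixel clock is spent on blanking intervals."""
    return h * v * hz * bpp * blanking / 1e9

MODES = {
    '2560x1440 @ 92Hz (the OC\'d 27" IPS)': (2560, 1440, 92),
    '3440x1440 @ 75Hz (U3415W OC)':         (3440, 1440, 75),
    '3840x2160 @ 60Hz':                     (3840, 2160, 60),
    '3840x2160 @ 120Hz':                    (3840, 2160, 120),
}

for name, (h, v, hz) in MODES.items():
    need = required_gbps(h, v, hz)
    fits = [link for link, cap in LINKS_GBPS.items() if cap >= need]
    print(f"{name}: ~{need:.1f} Gbps -> fits: {', '.join(fits) or 'none'}")
```

The blanking factor matters a lot at the margins: with tighter CVT-RBv2 timings (roughly 5% overhead instead of 20%), 4K @ 120Hz 8-bit just squeezes into DP 1.3's 25.92 Gbps, which is exactly why reduced-blanking custom timings are a standard trick when overclocking a monitor.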
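On the frame-skipping question in post 3: the usual way to check an overclock (e.g. the TestUFO frame-skipping test at https://www.testufo.com/frameskipping) is a pattern that advances one slot per displayed frame, photographed with a multi-frame exposure. Below is a minimal sketch of the same idea, assuming pygame 2.x; the slot count and window size are arbitrary, and vsync is only a request to the driver, so this is an illustration rather than a calibrated tool.

```python
import pygame

# One lit slot per displayed frame. Photograph the screen with a
# shutter time of several frames: every slot in the exposure should
# appear exactly once. Missing slots mean the monitor is silently
# dropping frames at the overclocked refresh rate.

SLOTS = 16
SIZE = (1280, 200)

pygame.init()
# vsync=1 is a request, not a guarantee -- confirm in driver settings
screen = pygame.display.set_mode(SIZE, pygame.SCALED, vsync=1)
slot_w = SIZE[0] // SLOTS

frame, running = 0, True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    x = (frame % SLOTS) * slot_w
    pygame.draw.rect(screen, (255, 255, 255), (x, 50, slot_w, 100))
    pygame.display.flip()  # advances one slot per vsync'd flip
    frame += 1

pygame.quit()
```

If the photo shows gaps, the panel is skipping frames and the "overclock" is doing more harm than good: you get the instability of the higher rate without actually seeing more frames.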