
CoolJosh3k

Member
  • Posts: 165
  • Joined
  • Last visited


  1. Summary: As I understand it, Lamptron is mass reselling the AIDA64 software as the core component of their products, but they never actually bought the keys they are “reselling”. A discovery that is causing issues for many distributors now that they know about it, and maybe for customers too.
     Quotes:
     My thoughts: Based on what was said in the video, it sounds like everyone who has bought and used one of the many Lamptron products could face legal issues if they now learn of this, or simply figure it out on their own. I find it very interesting that a company can keep doing such an illegal activity at such a scale for so long, while putting anyone they distribute through at major risk of legal trouble.
     Sources:
  2. Do you (or anyone else) know if it counts towards the 3 stream limit of the NVENC chip? Or is it a part of the card dedicated to just doing DSC?
  3. Come to think of it, what part of the graphics card handles the compression?
  4. I worry about whether DSC really is visually lossless, given that the criterion is “when all the observers fail to correctly identify the reference image more than 75% of the trials” (from https://en.wikipedia.org/wiki/Display_Stream_Compression). I had also hoped that, since “DSC can work in constant or variable bitrate mode” (also from that article), it would use the higher bitrate of HDMI for better results. Maybe I am just getting too technical here?
  5. That is why I am here actually. Since it has to use DSC, but HDMI 2.1 has more bandwidth, would it give a better image over HDMI?
  6. Maybe with DSC? Which means we’d be better off using HDMI? Would the same amount of compression be used?
  7. I notice the new MPG 321URX QD-OLED Gen 3 monitor from MSI coming soon says it has only one DP 1.4a port, but also says it will do 4K240 via DisplayPort. How is this possible? (My rough bandwidth math is in a sketch after this list.) https://www.msi.com/Monitor/MPG-321URX-QD-OLED/Specification
  8. Neither of these is a 4K240 OLED, let alone the new generation talked about in the video?
  9. Does the software for preventing burn-in on OLED monitors require Windows, or is it all managed entirely on the monitor? For example, could I have a Switch console connected to an OLED monitor for the entire life of that monitor and be fine? What about cases where the software works without Windows, but a Windows PC is still needed to configure (and maybe even set up) the anti-burn-in software on occasion?
  10. So will these be 34”? Or possibly smaller? Did they mention when the first of these monitors will be available?
  11. Have any videos so far had their mistakes addressed? I don’t watch enough LTT, and I am curious whether LTT is actually doing what they said they’d do.
  12. So I got interested enough to find the source they showed: https://hifigo.com/blogs/guide/what-is-digital-jitter-and-how-to-avoid-jitter-dac-101-part-3 Now I think LTT might have goofed in what they said. Some things about that source:
      * “Basically, sometimes a sound that’s really really high in frequency like a cymbal shimmer, harmonic, or other high note will have this strange warbling or oscillating sound that wasn’t in the original recording. What’s happening is that the DAC is accidentally creating a lower frequency note because the signal is just close enough in frequency to the sample rate—if the samples are taken at inaccurate times by an older, crappier clock mechanism.” - If the note were close to the sample rate yet still within the limit of human hearing (20 kHz), you’d have other issues to worry about.
      * Then there is this: “How do you avoid Jitter? Increase the sample rate, of course! … Essentially, you can eliminate this issue if you’re able to sample at least twice per period, thereby forcing sampling errors to exist only in the highest frequencies that you’d likely be unable to hear anyway. Considering that the uppermost limits of human hearing range from 12-22kHz, doubling that rate nets you somewhere within 24-44 thousand samples per second, or 44kHz.” - So the article itself acknowledges that the standard minimum sample rate already fixed this (see the aliasing sketch after this list).
      So LTT said jitter was actually a thing, while displaying an article explaining how it isn’t a thing??? Maybe they just meant very, very old hardware before CDs were a thing? Maybe someone who knows more can weigh in?
  13. Sounds like you know more than me. I wonder if LTT should be doing a video correction for this? I believe there is plenty of coverage around on how digital-to-analogue conversion works, like at 12:39 of this video, which I recalled and checked just now; that alone makes me think you’d need serious amounts of jitter to make any difference at all (I’ve put a rough worst-case estimate in a sketch after this list):
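A note on the 4K240-over-DSC bandwidth questions above: here is the back-of-the-envelope math I ended up doing. The effective link rates are the figures I believe are right for DP 1.4a (HBR3) and HDMI 2.1 (FRL), but the blanking overhead and the 12 bpp DSC target are my own guesses, not anything from MSI’s spec sheet.

```python
# Rough check: does 3840x2160 @ 240 Hz, 10-bit RGB fit over DP 1.4a or HDMI 2.1
# uncompressed, and what does DSC have to squeeze it to? The blanking overhead
# and the 12 bpp DSC target below are my assumptions, not from any spec sheet.

H, V, HZ, BPC = 3840, 2160, 240, 10
BLANKING = 1.05                       # guess for a reduced-blanking timing

px_per_s = H * V * HZ * BLANKING
uncompressed_gbps = px_per_s * BPC * 3 / 1e9            # RGB = 3 channels

dp_1_4a_gbps = 25.92     # HBR3: 32.4 Gbps raw minus 8b/10b encoding overhead
hdmi_2_1_gbps = 42.67    # FRL: 48 Gbps raw minus 16b/18b encoding overhead

dsc_bpp = 12                                            # common DSC target (3:1 vs 30 bpp)
dsc_gbps = px_per_s * dsc_bpp / 1e9

print(f"uncompressed ~{uncompressed_gbps:.0f} Gbps, "
      f"fits DP 1.4a: {uncompressed_gbps <= dp_1_4a_gbps}, "
      f"fits HDMI 2.1: {uncompressed_gbps <= hdmi_2_1_gbps}")
print(f"DSC @ {dsc_bpp} bpp ~{dsc_gbps:.1f} Gbps, "
      f"fits DP 1.4a: {dsc_gbps <= dp_1_4a_gbps}, "
      f"fits HDMI 2.1: {dsc_gbps <= hdmi_2_1_gbps}")
```

If those numbers are in the right ballpark, DSC is mandatory on either link for 4K240 10-bit; HDMI 2.1 just has more headroom, and whether a monitor actually uses that headroom to run DSC at a gentler compression ratio is exactly what I’d like to know.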
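On the jitter article: the “warbling lower note” it describes reads to me like plain aliasing, which is exactly what the sample-at-least-twice-per-period argument deals with. A tiny sketch of my understanding (the test frequencies are just examples I picked):

```python
# My reading of the article's "warbling lower note": that is aliasing, where a
# tone above half the sample rate folds back down into the audible band.
def alias_freq(f_hz: float, fs_hz: float) -> float:
    folded = f_hz % fs_hz
    return folded if folded <= fs_hz / 2 else fs_hz - folded

fs = 44_100                            # CD sample rate
for f in (18_000, 21_000, 25_000):     # the last one is already ultrasonic
    print(f"{f} Hz is captured as {alias_freq(f, fs):.0f} Hz")
```

At 44.1 kHz, anything that could fold down into the audible range had to be ultrasonic to begin with, which is why I think the article undercuts the point being made in the video.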
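And on how much clock jitter it would actually take to matter: here is the rough worst-case estimate I mentioned. The 1 ns figure is just an illustrative number, not a measurement of any real DAC.

```python
import math

# Worst-case amplitude error from sampling a sine at the wrong instant:
# the signal changes by at most |dx/dt| * dt = 2*pi*f*A*dt over a timing error dt.
def jitter_error_dbfs(f_hz: float, jitter_s: float, amplitude: float = 1.0) -> float:
    worst_case_error = 2 * math.pi * f_hz * amplitude * jitter_s
    return 20 * math.log10(worst_case_error / amplitude)

# A full-scale 20 kHz tone (about the worst case for audio) with 1 ns of jitter:
print(f"{jitter_error_dbfs(20_000, 1e-9):.1f} dBFS")   # roughly -78 dBFS
```

Even that worst case sits around -78 dB below full scale, which is why I suspect you’d need seriously broken hardware before jitter becomes audible.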