
Everything posted by chickennuggetstyle

  1. Maybe this is the case on some really expensive calibrated LCD, but I've done side-by-side comparisons of those OLED test videos between my CRT and every other TV/monitor in my house (several of which cost $1000+), and I've never seen any LCD come close. Admittedly, some of my crappier CRT SDTVs only do full on/full off. I prefer CRT smoothness to blocky LCD sharpness, but I can see how it could be annoying for some applications. Still, try running an LCD at any resolution besides its native one: you won't be impressed with the clarity. I don't doubt that LCDs are better for word processing, but I'd also be curious which "good CRTs" you've used. Good ones can display a checkerboard pattern without any noticeable lightening of the black levels. Color accuracy, response time, motion clarity, and, once again, black levels: someone who writes Word documents with their taskbar visible the entire day is probably not going to care about these things, but they're all objectively better on CRT. I'm not saying everyone should go buy a CRT (I already hate how difficult they are to get ahold of), but they do produce superior images. People tend to really hate the idea that they don't have "the best thing," because for some reason everyone wants to think of themselves as 100% pragmatic and objective. I think that's why everyone wants to remember CRTs so unfondly: we ditched them for a more convenient product which, at the time, was objectively far inferior (seriously, try to use an LCD from 2006). There's no shame in discarding an inconvenient product for a more convenient one, especially if the pros of the inconvenient product seem insignificant. What I can't get behind is when people ignore the limitations of the technology they use just because it feels shitty to not have "the best thing," especially when I hear the shoddy justification that "everyone else is using it." Display manufacturers are plenty complacent as it is. One guy a few comments up apparently has an adapter with decent bandwidth... which is exactly why I won't be getting a 980 Ti. But your insight is appreciated regardless.
  2. IMO this is a complete misconception. I've seen a lot of "high end" LCDs at this point, and would have agreed with you not so long ago. But I got ahold of my first high-end CRT just a few weeks ago, and I haven't looked back. The "better contrast through a brighter backlight" strategy LCD manufacturers use doesn't do it for me: dark scenes in low light still look like crap on any LCD that doesn't have some crazy number of dimming zones (which cause crazy blooming and often just look like shit anyway). 1440p is more than enough resolution for me at computer monitor sizes, and flicker isn't noticeable above 85 Hz. The one thing I'll concede to you is burn-in, but how hard is it to just turn on a screen saver, or put your computer to sleep when you're not using it? I don't mean to be inflammatory here, but CRT is just a superior technology for overall picture quality (in dark environments). Size and convenience are obviously massive drawbacks, but people who think consumer-level LCDs are ever gonna come close to "catching up" in picture clearly haven't yet seen a good CRT. And people who have, and still prefer their LCD, obviously have higher priorities than getting the absolute highest image quality achievable. I'm just a bit of a budget videophile.
  3. The CRT I'm running does better resolutions and framerates than my 2017 gaming laptop's LCD. That laptop has a GTX 1070 in it, and games still don't run as smoothly as I'd like. Low public opinion of CRTs disappoints me once again. But thanks for the help anyway.
  4. I'm using the official Apple adapter, but I didn't read the fine print (it only does 1080p60). All I need is 1600x1200 @ 85 Hz, which needs a considerably higher pixel clock than 1080p60 (see the rough comparison at the bottom of this page). Mind linking me to whatever it is you're using?
  5. I think I need to get some sleep. Seems like I'm struggling hard to get this point across in writing, lol. I know that DVI-I carries analog, but modern cards don't have DVI-I; only pre-Pascal stuff does. I'd like to be able to harness the power of a newer architecture without losing analog capability, so my question was about having a secondary GPU in the system (one that actually has DVI-I), which should just work as a DAC. Sorry for my lack of clarity. Going to edit my post now.
  6. Well, the issue is actually hooking it up to the monitor, which only takes VGA.
  7. I haven't owned a desktop for 3 or 4 years now, but I think I finally hate my laptop enough to make the change. I only ever use the thing at home plugged into an external display anyway, and it's frankly ugly as all hell, so I'm not gonna be sad to see it go. Hoping to find someone out there who'll swap me for a desktop with comparable specs, but maybe that's wishful thinking. Anyways, the issue I'm posting about is with my CRT monitor (no, I won't get a new monitor), which importantly only takes analog video in. My DisplayPort->VGA adapter has never had enough bandwidth for me to run the monitor at its max resolution and max refresh simultaneously, even though I know the monitor is capable of both at once. But I have heard that one can use an older GPU (with DVI-I) as a sort of passthrough/glorified video DAC, so it seems like the obvious solution for me is to stick an old Maxwell card in the build I'm planning. Problem is, I can't find a tutorial for setting the configuration up anywhere online, but I think it should be relatively similar to what LMG did in their "Nvidia Said We Couldn't Game On This Crypto Mining Card..." video. I really don't want to have to buy yet another dongly adapter (and risk it also not having enough bandwidth) if this much cleaner solution is an option. EDIT: I can't write. To rephrase all this more clearly: I like my CRT, but it needs native analog video in. Modern cards have jaw-dropping processing capabilities, but no native analog video out. Older cards have native analog video out. I'd like to put one new card and one old card in my upcoming system build, configured so that the older card can work as an overqualified DAC and display output. Videos like the one above indicate that it should be possible. Anyone know if it is, and if so, how to do it? (There's a rough sketch of the Windows side of this at the bottom of the page.) Any help or insight would be strongly appreciated.
  8. $80, nice. I remember posting on here a few years back (on my old acc) to boast like a total asshole about finding a 980 Ti for under $400. I can still feel the excitement.
  9. Any chance you could explain the difference between an active and a "standard" adapter to me? I assume all of them would require external power, right (given that there's a digital->analog conversion occurring)?
  10. Damn, so I really am fucked. Guess I'll just have to live with... latency
  11. I guess if you really wanted to, you could get a bunch of other parts and make a rudimentary PC cluster/"supercomputer". I don't know much about them, though, and even ignoring the severe OS limitations, performance would probably suck compared to just selling the parts and upgrading your one system as a whole.
  12. Planning on hooking up an analog TV to a sort of gaming/home theater PC. Unfortunately, as far as I can tell, the age of DVI on GPUs is over. Just wondering if there's anything I've missed here, or if my 2 best options really are adapters and used cards :(
  13. Hadn't seen the MX Master 3 before... beautiful thing it is. Think I'll go for that. Thank you.
  14. Is the MX Master that bad for gaming? I do play some occasional BFV, Minecraft, and GTA V, so I can't say I don't plan to game at all...
  15. Thanks for the recommendations! I am particularly interested in finding something wireless with a really fresh, sophisticated-looking aesthetic (to complement my lack thereof), so I don't think any of these will be for me (unless I decide to just get a new MX Master), but I appreciate it nonetheless!
  16. I've used the same Logitech MX Master for a little under 3 years now, and it's finally conked out. I was a bit of a dick to this mouse and accidentally dropped it many times, so that's a lesson learned. Can't turn the thing on anymore without it constantly side-scrolling, which makes me feel very sad and very stupid. I'm sure the market has changed since I bought it, though, so I'd love to hear from y'all what the best replacement would be today. I see a lot of these black plastic "gaming mice" with some big colorful light inside that can't be set to white, but I'm personally not a fan of the aesthetic (just a personal preference; obviously I have no objective issue with it or anyone who uses it). Is there anything as stylish and minimalistic as the MX Master (or more so) with equally good function/reliability? I'm a big fan of silver, if that helps. Thanks in advance for the help, -Joshua :)
  17. Thanks for this! Will any games take advantage of more than 8 GB of RAM yet? I've been completely out of the loop in PC building for over a year, so I haven't caught up with the whole Ryzen thing yet. I assume it just takes extra advantage of RAM speed and capacity, making RAM worth more to the build?
  18. https://pcpartpicker.com/list/3jbqf8 It's only going to be used for gaming. It doesn't need upgradability, OC support, or anything like that. Just trying to come up with a fairly long-lasting gaming build.
  19. Usually, one doesn't comment on posts over 1 day old... Fascinating.
  20. So about 6 months ago, I was really active on this forum, but I kinda stopped coming here. I don't remember this forum ever being so insanely flooded with posts, but it doesn't seem like the channel had any ridiculous sub boost to account for it... Am I just insane?
  21. I appreciate the help, but I live in a family of people who only do things if there's something in it for them. I'd best start a new thread on bribery tips.
  22. My house is fairly big, as I live with a large family, and since the house is being renovated, there are piles of shit all over the place, and I can't look through all of them, especially if I'm not even sure anything's in there. I went on a trip to Sweden one day and came back the next to find that it was gone. I've looked quite a bit already, so I need to get a lead on where it might be.
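
A footnote on the adapter bandwidth question in post 4: the shortfall is easy to see by comparing pixel clocks. The mode clocks below are standard published timings (CEA-861 for 1080p60, VESA DMT for 1600x1200 @ 85 Hz); treating the adapter's ceiling as exactly its 1080p60 rating is an assumption, so check the actual spec sheet. A minimal sketch:

```python
# Rough pixel-clock comparison for post 4.
# Assumption: an adapter rated "1080p60 max" tops out near that mode's
# 148.5 MHz pixel clock; the real limit depends on the DAC chip inside.
ADAPTER_LIMIT_MHZ = 148.5

modes_mhz = {
    "1920x1080 @ 60 Hz (CEA-861)": 148.5,   # what the adapter is rated for
    "1600x1200 @ 85 Hz (VESA DMT)": 229.5,  # what the CRT wants
}

for mode, clock in modes_mhz.items():
    verdict = "fits" if clock <= ADAPTER_LIMIT_MHZ else "exceeds the adapter"
    print(f"{mode}: {clock} MHz -> {verdict}")
```

The CRT mode needs roughly 55% more pixel clock than the adapter is rated for, which is why no driver or cable tricks will ever get it there.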
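And on the two-card question in post 7: the usual shape of the answer is to plug the CRT into the old card's DVI-I, then tell the OS to render the game on the fast card and copy frames to the old card for scanout. The sketch below scripts the Windows 10+ per-app GPU preference; the registry location and value format reflect my understanding of where that setting lives (the same thing Settings > System > Display > Graphics sets by hand), and the game path is hypothetical, so verify on your own machine before relying on it.

```python
# Hedged sketch for post 7 (Windows 10 1803+ assumed): render on the
# high-performance GPU while the CRT hangs off the old DVI-I card.
# Assumption: per-app GPU preference is stored under this HKCU key as
# strings like "GpuPreference=2;" (2 = high performance, 1 = power saving).
import winreg

GAME_EXE = r"C:\Games\SomeGame\game.exe"  # hypothetical path; use your game's exe

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
# The frame still leaves through whichever card the monitor is physically
# plugged into; the old card just acts as the DAC/display output.
winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)
```

This is the same render-on-one-card, display-on-another idea the post compares to the LMG mining-card video.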