Posts posted by chickennuggetstyle

  1. 6 hours ago, Electronics Wizardy said:

    Even without playing with backlights, LCDs are much better than CRTs. CRTs only get about a 100-300:1 contrast ratio with a checkerboard pattern or similar, while LCDs can get about 1000:1. They do well in full on vs. full off, but that's a pretty unrealistic use case.

    Maybe this is the case on some really expensive calibrated LCD, but I've done side-by-side comparisons of those OLED test videos between my CRT and every other TV/monitor in my house (several of which cost $1000+). I've never seen any LCD come close. Admittedly, some of my crappier CRT SDTVs only do well in full on/full off.
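    For anyone who wants to sanity-check the numbers being thrown around: "full on/full off" contrast is just full-screen white luminance divided by full-screen black luminance, while ANSI/checkerboard contrast averages the bright and dark patches of a checkerboard shown at the same time. A rough sketch of that math, with luminance values that are purely made up for illustration (not measurements from any of my sets):

```python
# Illustrative only: fabricated luminance values (cd/m^2), not measurements.

def on_off_contrast(full_white, full_black):
    """Full-screen white vs. full-screen black ("full on/full off")."""
    return full_white / full_black

def checkerboard_contrast(white_patches, black_patches):
    """ANSI-style contrast: mean of the bright patches over mean of the dark
    patches, measured while both are on screen at the same time."""
    return (sum(white_patches) / len(white_patches)) / \
           (sum(black_patches) / len(black_patches))

# Hypothetical CRT: near-zero full-screen black, but internal glass scatter
# lifts the dark patches once bright patches share the screen.
print(on_off_contrast(100, 0.01))              # ~10000:1
print(checkerboard_contrast([95]*8, [0.4]*8))  # ~240:1

# Hypothetical LCD: backlight bleed sets the black floor, so both numbers
# land near the panel's native ~1000:1 regardless of the pattern.
print(on_off_contrast(300, 0.3))               # 1000:1
print(checkerboard_contrast([290]*8, [0.3]*8)) # ~970:1
```

    That's roughly why we can both point at very different numbers: CRTs look absurd on the full on/off test and much more ordinary on a checkerboard, while a decent LCD sits near its native ratio on both.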

    6 hours ago, Electronics Wizardy said:

    LCDs are also much sharper normally too. I have used many a good CRT, and text just isn't as sharp, and on a large display 4K/6K is a pretty noticeable improvement.

    I prefer CRT smoothness to blocky LCD sharpness, but I can see how it could be annoying for some applications. Still, try running an LCD at any resolution besides its native one; you won't be impressed with the clarity. I don't doubt that LCDs are better for word processing, but I'd also be curious which "good CRTs" you've used. Good ones can display a checkerboard without any noticeable lightening of the black levels.

    7 hours ago, Electronics Wizardy said:

    This is still an issue with things like taskbars and static elements, and lots of people use their monitors for 8+ hours a day, so a screen saver won't help here.

     

    Not really, there is a good reason why everyone has moved on from CRTs now.

    Color accuracy, response time, motion clarity, and once again: black levels. Someone who writes word documents with their taskbar visible the entire day is probably not going to care about these things, but they're all objectively better on CRT. I'm not saying everyone should go buy a CRT (I already hate how difficult they are to get ahold of) but they produce superior images.

     

    People tend to really hate the idea that they don't have "the best thing", because for some reason everyone wants to think of themselves as 100% pragmatic and objective. I think that's why everyone wants to remember CRTs so unfondly: because we ditched them for a more convenient product which, at the time, was objectively far inferior (seriously, try to use an LCD from 2006). But there's no shame in discarding an inconvenient product for a more convenient one, especially if the pros of the inconvenient product seem insignificant. What I can't get behind is when people ignore the limitations of the technology they use, just because it feels shitty to not have "the best thing". Especially when I hear the shoddy justification that "everyone else is using it". Display manufacturers are plenty complacent as it is.

    7 hours ago, Electronics Wizardy said:

    But for your original question: people stopped caring about VGA a while ago. I'd just get a 980 Ti here if I were you; they're pretty cheap and still fast. I don't know of any good VGA converter, and there is no native way to have another GPU.

    One guy a few comments up apparently has an adapter with decent bandwidth.

    7 hours ago, Electronics Wizardy said:

    Using an older GPU also has the issue of software support being dropped for older GPUs, forcing you onto an old driver.

    ...which is exactly why I won't be getting a 980Ti. But your insight is appreciated regardless.

  2. 19 minutes ago, Electronics Wizardy said:

    I mean, LCDs are better in almost all ways now: brighter, better contrast, higher resolution, no flicker, no burn-in. We got things like HDR recently since we're no longer limited by standards made with CRTs in mind.

    IMO this is a complete misconception.

     

    I've seen a lot of "high end" LCDs at this point, and would have agreed with you not so long ago. But I got ahold of my first high-end CRT just a few weeks ago, and haven't looked back. The "make better contrast by making the backlight brighter" strategy from LCD manufacturers doesn't do it for me. Dark scenes in low light still look like crap on any LCD that doesn't have some crazy number of dimming zones (which cause crazy blooming and often just look like shit anyway). 1440p is more than enough resolution for me at computer monitor sizes, and flicker isn't noticeable above 85Hz. The one thing I'll concede to you is burn-in, but how hard is it to just turn on a screen saver? Or put your computer to sleep when you're not using it?

     

    I don't mean to be inflammatory here, but CRT is just a superior technology for overall picture quality (in dark environments). Size and convenience are obviously massive drawbacks, but people who think consumer-level LCDs are ever gonna come close to "catching up" in picture quality clearly haven't yet seen a good CRT. And people who have and still prefer their LCD obviously have higher priorities than getting the absolute highest image quality achievable. I'm just a bit of a budget videophile 🙂

  3. 11 minutes ago, manikyath said:

    tbh, for any resolution a CRT could game at, just buy some second hand 980(Ti) and call it a day..

     

    you're gonna lose out on RTX, for the 'several' games that make it a worthwhile investment of 2000 of your dollars...

    The CRT I'm running handles higher resolutions and refresh rates than my 2017 gaming laptop's LCD. That laptop has a GTX 1070 in it, which still doesn't run things as smoothly as I'd like. Low public opinion of CRTs disappoints me once again 😞 But thanks for the help anyways.

  4. 3 minutes ago, Caroline said:

    What kind of adapter are you using?

     

    I'm rocking a CRT, and my adapter tops out at 1920x1440 and 97Hz, but I set it to 85 for daily use.

     

    Oh, and make sure you're using a 15-pin cable to connect the monitor and adapter. That's important.

    I'm using the official Apple adapter, but I didn't read the fine print (it only does 1080p60). All I need is 1600x1200@85Hz. Mind linking me to whatever it is you're using? 🙂
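    (For anyone who lands here with the same problem: the limit on these converters is usually the DAC's maximum pixel clock rather than the resolution itself. A rough back-of-the-envelope below, using an assumed ~30% blanking overhead instead of exact GTF/CVT timing math, so treat the numbers as ballpark only.)

```python
# Ballpark pixel-clock requirements for analog modes. The 1.30 blanking factor is an
# assumption standing in for real GTF/CVT timing math, so treat the outputs as rough.
BLANKING_OVERHEAD = 1.30

def approx_pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1920, 1080, 60),   # roughly where a "1080p60 only" adapter gives up
                 (1600, 1200, 85),   # the mode I actually want
                 (1920, 1440, 97)]:  # the mode Caroline's adapter reportedly reaches
    print(f"{w}x{h}@{hz}Hz -> ~{approx_pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")
```

    So, assuming that ballpark holds, I should be shopping for a converter whose DAC is rated comfortably above ~200 MHz, not just one that advertises "1600x1200" somewhere on the box.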

  5. 1 minute ago, manikyath said:

    DVI-I (the type of DVI that has the "+" with the 4 pins around it) is basically also a VGA output; you just need one of those passive adapter plugs, or a DVI-I to VGA cable.

     

    and it's either gonna list the compatible modes for your CRT if it is modern enough for EDID, or list some default modes.

    either way, on VGA nvidia control panel lets you go WILD with custom resolutions.

    the problem OP is facing, but didn't quite know how to explain, is native VGA output, so he doesn't have to deal with some adapter going funky with odd-ball CRT modes.

    I think I need to get some sleep. Seems like I'm struggling hard to get this point across in writing, lol.

     

    I know that DVI-I carries analog output, but modern cards don't have DVI-I; only pre-Pascal stuff does. I'd like to be able to harness the power of newer architectures without losing analog capability, so my question was about having a secondary GPU in the system (one that actually has DVI-I), which should just work as a DAC.

     

    Sorry for my lack of clarity. Going to edit my post now.

  6. I haven't owned a desktop for 3 or 4 years now, but I think I finally hate my laptop enough to make the change. I only ever use the thing at home plugged into an external display anyways, and it's frankly ugly as all hell, so I'm not gonna be sad to see it go. Hoping to find someone out there who'll swap me for a desktop with comparable specs, but maybe that's wishful thinking.

     

    Anyways, the issue I'm posting about is with my CRT monitor (no, I won't get a new monitor), which importantly only takes analog video in. My DisplayPort-to-VGA adapter has never had enough bandwidth to run the monitor at its max resolution and max refresh rate simultaneously, even though I know the monitor is capable of both. But I have heard that one can use an older GPU (with DVI-I) as a sort of passthrough/glorified video DAC.

     

    It seems like the obvious solution for me is to stick an old Maxwell card in the build I'm planning. The problem is that I can't find a tutorial anywhere online for setting this configuration up, but I think it should be relatively similar to what LMG did in their "Nvidia Said We Couldn't Game On This Crypto Mining Card..." video.

     

    I really don't want to have to buy yet another dongly adapter (and risk it also not having enough bandwidth) if this much cleaner solution is an option.

    EDIT: I can't write. To rephrase all this more clearly:

     

    I like my CRT, but it needs native analog video in. Modern cards have jaw-dropping processing capabilities, but no native analog video out. Older cards have native analog video out. I'd like to put one new card and one old card in my upcoming system build, configured so that the older card can work as an overqualified DAC and display output. Videos like this indicate that it should be possible. Anyone know if it is, and if so, how to do it?

     

    Any help or insight would be strongly 💪 appreciated.
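    P.S. for anyone searching later: the closest named mechanism I've found for this is GPU render offload / hybrid graphics, where one card does the rendering and the finished frames get handed to whichever card the monitor is physically plugged into. On Windows 10 the per-app choice lives under Settings > System > Display > Graphics settings; on Linux with the NVIDIA driver it's PRIME render offload. Below is a minimal Linux-side sketch of the idea. Big caveat: the documented setup is integrated-GPU display plus NVIDIA render, and I have not verified that a new NVIDIA card plus an old DVI-I NVIDIA card behaves the same way, so this is a sketch of the mechanism, not a recipe.

```python
# Sketch: launch a program so it renders on the NVIDIA "offload" GPU while the display
# stays on the card the monitor is plugged into. Uses NVIDIA's documented PRIME render
# offload environment variables; untested with a new-card + old-DVI-I-card pairing.
import os
import subprocess

def run_on_render_gpu(cmd):
    env = dict(
        os.environ,
        __NV_PRIME_RENDER_OFFLOAD="1",       # ask the driver to render on the offload GPU
        __GLX_VENDOR_LIBRARY_NAME="nvidia",  # route GLX through the NVIDIA driver
    )
    return subprocess.run(cmd, env=env)

if __name__ == "__main__":
    # glxinfo -B prints the renderer string, so you can confirm which GPU did the work.
    run_on_render_gpu(["glxinfo", "-B"])
```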

  7. 3 minutes ago, 3rrant said:

    Use an active adapter; that way you won't lose noticeable quality and can use any modern GPU you want. And upgrade your TV in the future when you have the chance to.

    Any chance you could explain the difference between an active and a "standard" adapter? I assume all of them would require external power, right (given that there's a digital-to-analog conversion happening)?

  8. 1 minute ago, minibois said:

    DVI hasn't even been the analog variant for a long time now. The last couple of generations (current one not included) had cards with DVI-D, not DVI-I.

     

    For DVI-I you probably have to go back to the R9 200 series (although DVI-I was becoming uncommon even there) and the GTX 500 or 600 series, if memory serves me right.

     

    Or you have to go even further back for VGA or S-Video support.

    Or you of course can get an adapter.

    Damn, so I really am fucked. Guess I'll just have to live with... latency 😭

  9. Planning on hooking up an analog TV to a sort of gaming/home-theater PC. Unfortunately, as far as I can tell, the age of DVI on GPUs is over. Just wondering if there's anything I've missed here, or if my two best options really are adapters and used cards :(

  10. 1 minute ago, TVwazhere said:

    If you're not playing at a competitive level it's probably fine. 

     

     

    I'm using a 2S for work and it's honestly a major upgrade from the previous M510; I can't see myself going back. The MX Master 3 refines a lot of the 2S, which was itself a refinement of the original, so... if it ain't broke, don't fix it, I guess.

     

    Hadn't seen the Master 3 before... beautiful thing it is. Think I'll go for that. Thank you.

  11. 4 minutes ago, Streetguru said:

    Thanks for the recommendations! I'm particularly interested in finding something wireless with a really fresh, sophisticated-looking aesthetic (to complement my lack thereof), so I don't think any of these will be for me (unless I decide to just get a new MX Master ;)), but I appreciate it nonetheless.

  12. I've used the same Logitech MX Master for a little under 3 years now, and it's finally conked out. I was a bit of a dick to this mouse and accidentally dropped it many times, so that's a lesson learned. I can't turn the thing on anymore without it constantly side-scrolling, which makes me feel very sad and very stupid. I'm sure the market has changed since I bought it though, so I'd love to hear from y'all what the best replacement would be today. I see a lot of these black plastic "gaming mice" with some big colorful light inside that can't be set to white, but I'm personally not a fan of the aesthetic (just a personal preference; obviously I have no objective issue with it or anyone who uses it). Is there anything as stylish and minimalistic as the MX Master (or more so) with equally good function/reliability? I'm a big fan of silver, if that helps.

    Thanks in advance for the help,

    -Joshua :)

  13. 5 minutes ago, DocSwag said:

    First off, get Ryzen instead. While i5s do have better average frame rates in gaming right now, Ryzen tends to have better minimums, which in my opinion matters more, as it gives you a smoother and more consistent experience.

    http://www.gamersnexus.net/hwreviews/2875-amd-r5-1600x-1500x-review-fading-i5-argument/page-4

     

    PCPartPicker part list / Price breakdown by merchant

    CPU: AMD - Ryzen 5 1600 3.2GHz 6-Core Processor  ($199.99 @ SuperBiiz) 
    Motherboard: ASRock - AB350M Micro ATX AM4 Motherboard  ($75.98 @ Newegg) 
    Memory: Team - Dark 16GB (2 x 8GB) DDR4-3000 Memory  ($109.99 @ Newegg) 
    Storage: PNY - CS1311 120GB 2.5" Solid State Drive  ($52.99 @ Amazon) 
    Storage: Western Digital - RE4 1TB 3.5" 7200RPM Internal Hard Drive  ($43.99 @ Amazon) 
    Video Card: Zotac - GeForce GTX 1070 8GB Mini Video Card  ($474.98 @ Newegg) 
    Case: Silverstone - PS08W MicroATX Mid Tower Case  ($39.99 @ Amazon) 
    Power Supply: SeaSonic - G 550W 80+ Gold Certified Semi-Modular ATX Power Supply  ($49.90 @ Newegg) 
    Wireless Network Adapter: Gigabyte - GC-WB867D-I REV 4.2 PCI-Express x1 802.11a/b/g/n/ac Wi-Fi Adapter  ($29.99 @ SuperBiiz) 
    Total: $1077.80
    Prices include shipping, taxes, and discounts when available
    Generated by PCPartPicker 2017-07-04 09:49 EDT-0400

     

    Second of all, GPUs are horribly overpriced at the moment thanks to miners; 1070s even cost as much as 1080s! Therefore I'd suggest you either wait for prices to die down a bit or buy a used 980 Ti (you can usually get one for $300 or so).

     

    I made some changes to the build: I went with a 1600 instead, plus an accompanying mobo and 16GB of high-speed DDR4 RAM. I also added an SSD and got a better PSU. You had two WiFi cards in there, so I got rid of one and replaced the other with a better one (it supports 802.11ac). I know it's a lot more expensive, but that's because, as I said, GPU prices are way higher right now. 1070s should be $100 less, which would put my build at the same price as your original.

    Thanks for this! Will any games take advantage of more than 8GB of RAM yet? I've been completely out of the loop on PC building for over a year, so I haven't caught up with the whole Ryzen thing yet. I assume it just takes extra advantage of RAM speed and capacity, making RAM worth more to the build?

  14. So about 6 months ago I was really active on this forum, but I kinda stopped coming here. I don't remember this forum ever being so insanely flooded with posts, but it doesn't seem like the channel had any ridiculous sub boost that would explain it... Am I just insane?

  15. Just now, mpsparrow said:

    I don't think such software is out there. You're just going to have to use the old-fashioned method of looking for it. Get your family to help you look.

    I appreciate the help, but I live in a family of people who only do things if there's something in it for them. I'd best start a new thread on bribery tips.

  16. Just now, mpsparrow said:

    What is keeping you from just walking around your house and finding it? In fact, if you hadn't wasted your time making this thread, you probably could have already found it.

    My house is fairly big, as I live with a large family, and since the house is being renovated, there are piles of shit all over the place that I can't look through entirely, especially when I'm not even sure anything's in there. I went on a trip to Sweden one day and came back the next to find it was gone. I've already looked quite a bit, so I need a lead on where it might be.
