
Ja50n
Member · 217 posts

Posts posted by Ja50n

  1. My Galaxy S7 was only $260 and gave me a 1440p OLED screen, an awesome camera, and all the performance I need. If I want, I can expand the storage, and it’s still waterproof! I’ve customized my home screen so it looks unlike anybody else’s phone. Also, it’s got a gold front face, which looks gorgeous.

  2. 5 hours ago, LAwLz said:

    This is what I meant when I said people are just looking at the very basic specs when comparing G-Sync and FreeSync, and ignoring a lot of the functions and features that are not obvious. Those two monitors are not comparable. The G-Sync one is a higher tier monitor.

     

    The G-Sync one has strobing backlight, unlike the Freesync one.

    The FreeSync monitor also lacks adaptive overdrive, so if you want to avoid ghosting you're going to need to manually tune your overdrive settings, preferably for each individual game.

     

    After looking at tests of the two monitors I can also say that:

    The Asus one has better calibration out of the box, for example more accurate colors and grayscale. It's not all that much, but still enough to be noticeable. The G-Sync monitor also has lower input lag (23 ms vs 28 ms).


    It's true that strobing backlight monitors need extra brightness to compensate for the black intervals. However, this is not really an issue on modern displays (at least high-quality ones) because they are fitted with eye-searingly bright backlights to begin with. If you run your current display at 100% brightness, and it's a fairly high-end one, then chances are you are getting eye strain and headaches because the display is too bright. It's like shining a flashlight into your eyes.

     

    The strobing backlight can be turned off, so even if there were some disadvantages you could just turn it off if you didn't want it. The point is that you're getting an extra feature that you can use if you want (and it's brilliant for reducing motion blur).


    I am not saying the FreeSync monitor is bad. It's a good monitor. But the Asus one is in fact a higher tier one. More features and overall a slightly better display in terms of accuracy.

     

    I am getting kind of tired of people doing these types of comparisons where they conclude that "FreeSync is cheaper than G-Sync" without bothering to look into more than screen size, resolution, and refresh rate. You can't say two monitors are comparable without doing some in-depth reading of reviews by people who have done things like measuring them with a spectrophotometer or colorimeter. And once you start looking into it a bit more, you will realize that FreeSync monitors are usually cheaper, especially when we're talking about several hundred dollars, because they are just not as good as the G-Sync monitors they're being compared to.

     

    This 200 dollar G-Sync tax myth needs to die.

    I feel you're actually proving my point there, because I'm guessing those extra features on G-Sync aren't optional, but required. Therefore someone who's monitor shopping is likely to spend a few hundred more for the Nvidia solution, regardless of whether they wanted more features for their money. It's kind of like justifying RTX cost compared to Vega purely on ray tracing... Yes, RTX cards can do more, but they also cost a lot more even when doing the same thing. G-Sync may very well be better; we'll see once comparisons come out! It would still have cost me $600-800, though, when my FreeSync monitor (FreeSync being a feature that made no impact on my purchase; it was just the best deal in that class of monitor at the time) was only $400. Features missing, probably, but I like being able to choose that for myself.

  3. 4 hours ago, LAwLz said:

    Got any links to those monitors you compared, and did you make sure they had the same feature set, such as strobing backlight and adaptive overdrive? 

    https://www.newegg.com/Product/Product.aspx?Item=N82E16824009769

    https://www.newegg.com/Product/Product.aspx?Item=24-236-797

    No clue about extra features, but backlight strobing would cut brightness in half and significantly increase the chances of eye strain and headaches...
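    The "cut brightness in half" claim follows from simple duty-cycle arithmetic: a strobed backlight is only lit for a fraction of each refresh, so its perceived (time-averaged) brightness is roughly that fraction of its peak output. A minimal sketch of the math — the nit figures and duty cycle below are made-up illustrative numbers, not measurements of either monitor:

    ```python
    def perceived_luminance(peak_nits: float, duty_cycle: float) -> float:
        """Time-averaged brightness of a strobed backlight, in nits.

        duty_cycle is the fraction of each refresh the backlight is on
        (1.0 = no strobing).
        """
        return peak_nits * duty_cycle


    def required_peak(target_nits: float, duty_cycle: float) -> float:
        """Peak output needed to hit a target average brightness while strobing."""
        return target_nits / duty_cycle


    # A 400-nit panel strobed at a 25% duty cycle averages out to 100 nits...
    print(perceived_luminance(400, 0.25))  # 100.0

    # ...so holding a comfortable 250-nit average at that duty cycle would
    # require driving the backlight to 1000 nits at its peak. This is why
    # strobing modes ship on monitors with very bright backlights.
    print(required_peak(250, 0.25))  # 1000.0
    ```

    Whether the halved brightness matters in practice then comes down to how much peak headroom the particular backlight has.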

  4. 1 hour ago, LAwLz said:

    Higher FPS, but at the expense of the screen. If you want a FreeSync monitor that is comparable to a G-Sync monitor (which is to say, has good variable refresh rate range, adaptive overdrive, light boost and all the other stuff that's standard on G-Sync) then it will not be cheap. It might not even be cheaper than G-Sync. 

     

    I don't think people realize how high the quality is, or how many features are included, in G-Sync monitors. There is way more to it than just resolution and refresh rate (which is what most people look at).

    I've been looking at monitors, comparing prices on Newegg, and usually when looking at two monitors of similar quality, brand, and ratings, the G-Sync one was about $200 more. Soon, we'll have third parties comparing these panels on the same card, and we will see a real side-by-side comparison, instead of testing FreeSync+Vega vs G-Sync+GTX!

  5. 6 hours ago, BobbyPdue said:

    NVIDIA isn't just taking money from people for nothing; there is an additional cost due to having a G-Sync module integrated in the monitor. That hardware and integration isn't going to be free.

    You’re right, it isn’t, but it means that to get G-Sync, due to the added module price, someone will have to step down from a 2080 to a 2070. Or they could keep the faster card, get a FreeSync monitor, and have a higher-FPS experience.

  6. 26 minutes ago, poochyena said:

    I can too, which is why I created the poll. Clearly 18:9 is NOT the broader audience like Linus seems to have said it was. 100+ people is a halfway decent sample size, too.

    If 18:9 was actually 50% of the audience or something, then fine, I'd concede... but it very clearly isn't.

    Be careful, as I’d say there’s a lot of bias in your results. The majority of people visiting the forum, finding your post, and then weighing in by voting and commenting are probably very unhappy with the move away from 16:9... It would be interesting to see the results of a Strawpoll during a WAN Show for broader audience polling, though hopefully the results would be similar :)

  7. I can understand wanting to appeal to the broader audience... but... was anyone really asking for the 2:1 aspect ratio? Was anyone on an iPhone X thinking “LTT sucks, I can’t even tell I have a notch when I watch their videos!”? Every time I watch an LTT video now, I’m distracted looking at black bars that shouldn’t be there, bars that pretty much no other YouTube channel has... and I forget to pay attention to the video.

  8. 16 hours ago, mr moose said:

    If they do sound as good as marketed they are not overly expensive for a BT headset with directional control and don't require earbuds or cans, which effectively means you can still hear the car that is about to hit you while listening to music and answering the phone.

    As I said above, if you don't need earbuds or cans then you have nothing to take out when someone wants to talk to you; you can still be present to your environment whilst enjoying the benefits of a BT headset.  $199 is not that expensive for something that has advantages (even if it doesn't turn out to be audio quality).

    Bone conductive headphones are already available, though, and the best ones sell for under $140... The difference here is Bose is adding “AR”, with their implementation sounding a lot like an audio program on a museum tour.

  9. Confusing marketing and a weird product... what purpose does this serve? Bluetooth headphones and a half decent pair of shades cost less and offer proven utility. I can’t wait for “The sun’s out but my glasses haven’t finished charging!”

  10. 7 hours ago, Drak3 said:

    Except that emulating PowerPC on x86 is and always was trivial. x86 on ARM, on the other hand, is a colossal task to even get working in a half-assed manner.

    When has it ever been trivial? My understanding was that PowerPC on x86 was a massive pain, with large performance penalties... Just take a look at what Apple managed at the time, when it took a quad-core Xeon to beat a single-core iMac G5: https://www.macworld.com/article/1053814/rosetta.html

    Windows on ARM, meanwhile, is bad but not any worse: https://www.techspot.com/review/1599-windows-on-arm-performance/page2.html

  11. 7 hours ago, Master Disaster said:

    This has been rumoured for years, I'll believe it when I see it.

     

    There's zero chance of Apple ditching x86 entirely; it would cause far too many problems for a company whose raison d'être is ease of use for customers.

    ARM’s ability to emulate x86 has come a long way. And look at Rosetta: Apple’s switched architectures before, and used an emulation stopgap so everyone could port their code away from PowerPC. So, there’s at least some chance!

  12. Okay, this thing just got WAY less appealing:

     

    Quote

    So the new Palm phone isn't really a Palm phone at all. Heck, it's not even really a phone. Sure, there's a 4G LTE chip in it, but it's no more a phone than the Apple Watch is. It relies on Verizon's NumberShare service to act like a phone when your actual phone isn't around.

    https://www.pcworld.com/article/3313878/android/new-palm-phone-specs-features-price.html

     
