JT_NC


  1. Great. So you can point me to the business study they conducted that outlines why it would be a bad idea (aka not profitable), rather than just your opinion? I came here because I really enjoy Linus Tech Tips and thought its forums would have some tech-savvy individuals that I could have some good discussions with. I didn't come here expecting to be told to GTFO. Thanks for setting me straight.
  2. Sheesh. I'm not sure if you're aware of this, but you have a really condescending attitude. Your opinion isn't the only one that matters. "Because they don't want to" may be a good enough reason for you. You don't even know that for sure. You're guessing, and haven't provided any solid facts to support your statements. So why don't they want to? Do they believe that their higher prices and equal performance will gain market share? You say I'll never know the market as well as they do. But you do? Do you or someone you know work for Nvidia and have first-hand knowledge that that's their reason? I'm not an Nvidia hater. They still clearly have the best-performing GPUs on the market. I just looked up the numbers, and in the past 10 years I've bought 10 different Nvidia GPUs (from the 8600 GTS to a pair of 1080 Tis this year), and 12 different AMD GPUs (from the X1600XT to the RX 480). That's a pretty even spread. But Freesync monitors outsell G-sync monitors significantly, due to a combination of excellent performance and price. If G-sync were significantly better than Freesync, this wouldn't be an issue by any means, but it's not. G-sync is just a proprietary chipset that accomplishes exactly the same thing as Freesync, yet those monitors cost significantly more. (lots of discussion on this on the first page of the thread) Having dropped over $2K on video cards this year alone, I'm not trying to be stingy here. I'm just honestly curious why it hasn't happened yet. I don't want you to explain 'basic business practices 101'. I'm probably quite a bit older than you and have more experience in that department anyway, at least based on the tone of your posts. So you couldn't provide a compelling or logical answer to my original question. That's fine. Don't worry about it. Other contributors have tossed out some good thoughts on the topic, so maybe you can learn something from those posts if you won't listen to mine. You don't have to try to make someone feel bad just because you don't know the answer to the question they asked. If not knowing the answer makes you feel bad, and making someone else feel worse is how you make yourself feel better, that's just not cool. Enjoy the weekend. Get some fresh air.
  3. When people know their Freesync monitor will work with an Nvidia GPU, they may choose an Nvidia graphics card over AMD. Nvidia makes money. If people could subscribe to a service that unlocks Freesync support on the GPU, they'd pay for it. Nvidia makes money. If Nvidia decided to make a converter card or box that they could sell (probably for a couple hundred bucks even), Nvidia could make money, and if they had their partners build those, they wouldn't be pissing them off either, because their partners would also make money. If someone had an AMD card but a G-sync monitor, Nvidia could have a converter for them too. Nvidia makes money. If Nvidia adopted some of these ideas, shed their somewhat stuck-up attitude and unwillingness to play nice with others, and maybe even lowered some of their prices, they could actually increase their market share. Nvidia makes money. You wanted one reason. There are several. Nvidia could make money. Nvidia's partners could make money. Consumers would also be happy. Instead, we've got nothing. Which brings us, once again, to the question: why?
  4. If you go back to my very first post, I clearly pointed out that this is something Nvidia would need to be involved with. I'm well aware of the legal ramifications. It's just very surprising to me that, in this day and age, there's no bridge between the two technologies. Hence the subject line...
  5. @Vantage9 Well, the way I see it... Nvidia is simply saying "I don't want your G.I. Joes sitting in my Millennium Falcon. I don't care that your action figures are exactly the perfect size. This toy is mine, and I don't want you playing with it." It's very immature. In most cases like this, a compromise or workaround is found. You can convert micro SD to mini SD to SD to CompactFlash. You can take all the USB converters you want and string them together to make your devices work. You can convert from DisplayPort to HDMI to VGA. You can run different operating systems in emulators. Someone usually finds a way. Originally, motherboards were strictly Crossfire or SLI. But they got over that and now the same boards can support both. There's a market for a G-sync to Freesync converter. No one's built one yet. Nvidia could easily do it if they wanted to (or "flip the switch" if that would work), and they'd make money from it. That's why I think there should be one. Or, they could always compete on price, since the markup for G-sync is pretty ridiculous.
  6. Thanks @-rascal-, I guess I should've read more of the text on that comparison page. The 3GB version has 10% fewer cores, which explains the 10% lower scores (there's a quick arithmetic sketch of this after these posts). I thought the two were otherwise identical, like the 4GB vs 8GB RX 480/580, where the extra VRAM was pretty much just better for higher resolutions. So yeah, VRAM doesn't do a lot to affect performance, assuming you have enough for the resolution you want to play at. 2GB is still on the low side though.
  7. Here are a couple of comparisons. For the 1060, 3GB vs 6GB has a pretty big performance impact (a 10%-40% increase in fps): http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1060-3GB-vs-Nvidia-GTX-1060-6GB/3646vs3639 And here's the 1060 6GB compared to the 1050 Ti 4GB: http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1050-Ti-vs-Nvidia-GTX-1060-6GB/3649vs3639 The 1060 blows it out of the water, with 80-90% better synthetic scores and consistently 40%-70% more fps. In a lot of games, you won't get over 45 fps at 1080p with max settings using the 1050 Ti. In this case, you really do get what you pay for.
  8. My main point was that the same rules apply to evolving both technologies. In the end, from concept to retail delivery, a G-sync monitor and a Freesync monitor have the same associated costs, so the claim that a monitor implementing G-sync *should* cost significantly more than one implementing Freesync just doesn't hold any water. It's not a unique situation; Intel certainly charges more than they need to, and so does Apple. It's just a result of living in a free-market society. But it's also very annoying, and there should be a workaround by now.
  9. And AMD went on to develop Freesync 2, yet those monitors still come in a lot cheaper. There's just no justification for the price disparity. It's simply Nvidia doing it because they can. http://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming
  10. I've got a Ryzen 5 1500X with two RX 480s (I moved one over when I got the 1080 Ti for my main PC). Together, they scored just high enough in 3DMark Time Spy to fit in the 4K Gaming PC category. The Ryzen 3 definitely wouldn't get rated that high, and I suspect it would indeed be a bottleneck. As @PCGuy_5960 wrote, that i7-7700K build would give you fantastic gaming performance. If you don't already have one, I'd recommend an M.2 NVMe boot drive though.
  11. The G-sync controller was designed enough years ago that they shouldn't still be paying for the R&D; that technology doesn't need to be redeveloped for every new monitor design (there's a rough amortization sketch after these posts). All they need to design on a per-monitor basis is the component layout and the shape of the circuit board (which is more likely dictated by the monitor manufacturer than by Nvidia anyway), and that shouldn't jack up the price so much, considering the same work has to be done regardless of whether the monitor is G-sync or Freesync, yet Freesync monitors cost significantly less. That price disparity is simply Nvidia's version of the "Apple tax". If people are willing to pay it, Nvidia will happily keep overcharging for it.
  12. Given the number of Freesync monitors out there, Nvidia could sell even more cards if they enabled theirs to work with both types of monitor, now that AMD has a *cough* competitor *cough*, at least once there's not a complete shortage of cards *cough cough*. I should get that checked out. If it's simply a "software switch", I'm really surprised no one's hacked the drivers. You know the difference between an Amazon Kindle without Ads and an Amazon Kindle with Ads? You pay more for the version without. Otherwise they're the same. It's just software. Nvidia could come up with some "premium membership" program. For $X (or $X/yr), members could have access to drivers that enable Freesync. Maybe toss in a game every month so people without Freesync monitors want to subscribe too. Problem solved. Win-win for everyone. (except those who don't participate, who won't be any worse off than they are today)
  13. But then they couldn't justify adding $300 onto the monitor's price for an extra $20-30 worth of silicon.
  14. For a 1080p monitor, the 1060 can handle all of those no problem. (Haven't played PUBG yet though.) Above 1080p you'd probably want the 1070, or you'd have to lower the settings a little. Prices have really jumped on those lately, so you might want to wait and watch the market for a bit. I grabbed a new 1060 6GB for a build about 6 months ago for $260 (it was competing with the 580 on price); the same card is $350 now. I've got it hooked up to a 2560x1440 monitor and it can dip below 60 fps, but not by much. Note that you'll want the 6GB version. The 3GB benchmarks noticeably lower (a much bigger gap than between the 4GB and 8GB AMD cards).
  15. Well, even if Nvidia doesn't want to do it themselves, I understand there's a certain country on this planet of ours where patents and copyrights don't hold much (if any) sway, which in turn provides us with all sorts of cheap knockoffs and useful gadgets we might not have otherwise. That being the case, it has me wondering whether the reason we haven't seen something turn up yet is technical rather than business-related. When unofficial (and possibly illegal) tech gains enough traction, a lot of companies tend to adopt it into their portfolios. That hasn't happened in this case. I tend to take the "why can't we all just get along" approach. Given that Nvidia recently tweeted about AMD's CPUs working great with Nvidia's GPUs, maybe there's hope for something official... someday. Thanks for the replies so far, everyone.
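
A quick arithmetic sketch (Python) of the numbers behind posts 6 and 7. The CUDA core counts are the commonly cited specs for the two GTX 1060 variants rather than figures pulled from the linked benchmark pages, and the fps values in the usage example are purely illustrative.

```python
# Core deficit of the GTX 1060 3GB vs the 6GB card.
# Counts below are the commonly cited specs (assumption, not from the linked pages).
cores_6gb = 1280   # GTX 1060 6GB CUDA cores
cores_3gb = 1152   # GTX 1060 3GB CUDA cores

core_deficit = 1 - cores_3gb / cores_6gb
print(f"3GB card has {core_deficit:.0%} fewer cores")   # -> 10%, matching the ~10% lower scores

# Fractional fps advantage of card A over card B, the kind of "X% more fps"
# figure quoted in post 7. Example numbers are illustrative, not measured.
def uplift(fps_a: float, fps_b: float) -> float:
    """Return the fractional fps advantage of A over B."""
    return fps_a / fps_b - 1

print(f"{uplift(63, 45):.0%} more fps")   # e.g. 63 fps vs 45 fps -> 40%
```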
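And a rough sketch of the amortization argument in post 11: a one-time controller R&D cost, spread over every G-sync monitor sold, shrinks toward zero per unit, so it can't explain a persistently large per-monitor premium. Every number here is a made-up placeholder, not an Nvidia figure.

```python
# Hypothetical one-time R&D spend and per-unit silicon/BOM cost (placeholders only).
rnd_cost = 50_000_000   # USD, assumed one-time controller R&D
module_bom = 25         # USD, assumed per-unit module cost

for units_sold in (250_000, 1_000_000, 5_000_000):
    per_unit = rnd_cost / units_sold + module_bom
    print(f"{units_sold:>9,} units -> ~${per_unit:,.0f} per monitor")
# The per-unit figure keeps falling as volume grows, while the retail premium stays flat.
```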