Everything posted by Faa

  1. Haha, going from Freesync all the way to republican governments lol. Sorry dude, you're the one who's crying, which just tells me you're the biased one. Says the guy who got scammed (8320 owner).
  2. Yeah: "Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic." That's the Gsync module, better known as an Altera Arria V GX. Nvidia has no right to charge a licensing fee as it's not their product at all. "But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module." http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion Stop asking people for a source if you can't even prove your own claims.
  3. Nope, as PCPER stated, you need a local frame buffer; AMD would have done it through drivers if that were possible. And yes, Gsync is royalty-free too. They can't charge a license fee for someone else's hardware.
  4. This has been PR'ed for almost two years or something and we haven't seen a single one launch.
  5. Lol 24GB? Specification:
     - GeForce Titan XXX
     - GPU: GeForce Titan XXX
     - Core Base Clock: 1268MHz
     - Core Boost Clock: 1493MHz
     - Memory Clock: 8600MHz Samsung GDDR5
     - Memory Size: 24576MB GDDR5
     - Bus Type: PCI Express 3.0
     - Memory Bus: 384-bit
     - CUDA Cores: 3072
     - DirectX 12: Yes
     - DVI Port: 1x Dual-Link DVI, 3x DisplayPort & 1x HDMI
     - DisplayPort: Yes
     - HDCP: Yes
     - HDMI: Yes
     - Power: 2x 8-Pin (2x 8-Pin to engage OC mode), 650W PSU Required
     - TDP: 300W (250W regular use / 300W OC mode)
     - Dimensions: L=282mm, W=110mm, H=43mm
     - Warranty: 10yr (BFG Lifetime Warranty (OcUK provides warranty))
  6. You'll get tearing/stutter below the min refresh rate with AS, and Nvidia's fix is multiplying the frame rate. VRR works down to 1 FPS on Gsync even. Atm that's not possible on AS because the monitor needs a local frame buffer. But you get slight flickering, so the choice is flickering vs stutter. Here you go: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind Pretty sure WCCF copy-pasted everything from PCPER. According to PCPER, they found that AS doesn't bring the experience Gsync does.
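     A minimal sketch of that frame-multiplying idea, assuming a hypothetical 30-144Hz panel (the window and FPS values are made up for illustration; this is not Nvidia's actual implementation):

         # When the game drops below the panel's minimum refresh rate,
         # resend each frame an integer number of times so the effective
         # refresh stays inside the VRR window. All values illustrative.
         def refresh_multiplier(fps, min_hz=30.0, max_hz=144.0):
             """Smallest multiple that lifts fps back into the VRR window."""
             if fps >= min_hz:
                 return 1  # already inside the window, nothing to do
             n = 1
             while fps * n < min_hz and fps * (n + 1) <= max_hz:
                 n += 1
             return n

         for fps in (10, 20, 25, 45):
             n = refresh_multiplier(fps)
             print(f"{fps} FPS -> each frame drawn {n}x, panel runs at {fps * n} Hz")

     At 20 FPS the panel would run at 40Hz with every frame shown twice, which is why you never land in the tearing/stutter zone on Gsync.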
  7. So is Gsync. Manufacturers only pay for the module, which is around $40-60. Also, driver support plus a compatible GPU shouldn't be advertised as license-free. There's no such thing as Freesync; it's just a marketing term, like Retina. How are you biased exactly if you proved it with a video nobody can argue with? They've gone in depth on why they prefer Gsync, mainly because the VRR zone is much better: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind
  8. But does Mantle solve your performance issues or not?
  9. Thank God you're real. Yeah, AMD blaming the scalers they've been designing together with many ASIC companies such as Realtek is stupid. "For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area." Do you have anything better to do than backpedal after being asked to provide a link, instead of posting PR pictures that prove less than what people have managed to prove about God existing to confirm you're real?
  10. You'll only believe something that comes straight from AMD, but if there's any negativity about Intel/Nvidia or positivity about AMD, you'll take any source seriously, however shit it is.
  11. "NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates." That's Nvidia's response. "For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area." AMD's response, which makes no sense. Monitor manufacturers know better than AMD, and Gsync doesn't even replace the Tcon, so quite sad.
  12. Nope, you're just making claims you can't prove.
  13. Yeah, a few possibilities: they don't know what ghosting is, or they abuse the term the way "lag" gets abused for low FPS. Only one thread: https://www.google.be/?gws_rd=ssl#q=site:geforce.com+swift+ghosting That's in 3D mode (Gsync doesn't work in 3D), so stop making shit up as you always do. You've been given a video of two monitors using the same exact panel, with Freesync causing ghosting; that's enough to jump off your zeppelin.
  14. The BenQ XL2730Z & Swift both use the same panel made by AUO (BenQ is to AUO what Crucial is to Micron), so yes it does.
  15. That won't work at all; you have to manipulate the Vblank interval to have VRR. If you're just mindlessly dumping all of your frames to your monitor, you create more input lag, because the display has more frames to process. Even your understanding of how AS works is flawed. Dynamically? It's variable. The GPU is not mindlessly sending frames to the monitor; that's not the purpose of a sync.
  16. 9-240Hz? All I'm seeing is 30-144Hz. Proprietary module? Uhm, it's an Altera Arria V GX, which anyone can buy. A performance penalty below 5% is within margin of error; run the test 100 times and the results will always vary, especially in games that don't even have an in-game benchmark run. Explain the ghosting then. Also, your explanation is completely wrong: when the monitor is done drawing the current frame, it will wait for the GPU to deliver the next frame (if none arrives, it will draw the last frame again). The delay is controlled with the Vblank interval, which is the same technique Freesync applies, so both will introduce some lag as the technique is the same. The GPU just polls the monitor to see whether it's in the vblank state or not, so you don't end up with bad scans, which Freesync does as well. It's a one-way system.
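     To make that wait-or-redraw behaviour concrete, here's a toy timeline in Python. All timings are invented for illustration; the real logic lives in the module/scaler hardware, not in software:

         # After finishing a scan-out the monitor sits in vblank waiting for
         # the next frame; if the panel's self-refresh deadline passes first,
         # it redraws the last frame from the local buffer.
         TIMEOUT = 1.0 / 30.0  # longest the panel can hold a frame (30Hz floor)

         def scanout_events(frame_arrivals):
             """frame_arrivals: GPU frame-completion times in seconds."""
             events, t, last = [], 0.0, None
             for arrival in frame_arrivals:
                 # panel redraws the old frame while the GPU is too slow
                 while last is not None and arrival - t > TIMEOUT:
                     t += TIMEOUT
                     events.append((round(t, 3), "redraw of previous frame"))
                 t = max(t, arrival)
                 last = arrival
                 events.append((round(t, 3), "new frame"))
             return events

         # the 90ms gap between the 2nd and 3rd frame forces two redraws
         for t, what in scanout_events([0.010, 0.025, 0.115, 0.130]):
             print(t, what)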
  17. The CPU doesn't need to be at 100% usage to form a bottleneck; like I said, it's a useless indicator. Even with the CPU at 100%, you can still have the GPU constantly at 99% usage, so there's effectively no bottleneck. Basically, the CPU tells the GPU what to do; if the CPU can't do that fast enough, your GPU is bottlenecked by the CPU. GPU usage is the indicator to monitor for this, that's all there is to it.
  18. CPU bottleneck: the GPU operates below 99% usage, e.g. 50% (assuming vsync is off). GPU bottleneck: you always want the GPU at maximum load for maximum performance; if it's there and you're still not happy with the performance, reduce settings or upgrade. That's all there is to it. Don't bother monitoring CPU usage, it's useless.
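     A rough way to automate that check, assuming an Nvidia card with nvidia-smi on the PATH (sample for ten seconds and flag sustained sub-99% GPU load):

         # Poll GPU utilization once a second via nvidia-smi and flag a
         # likely CPU bottleneck. Assumes nvidia-smi is installed/on PATH.
         import subprocess, time

         def gpu_utilization():
             out = subprocess.check_output(
                 ["nvidia-smi", "--query-gpu=utilization.gpu",
                  "--format=csv,noheader,nounits"])
             return int(out.decode().strip().splitlines()[0])

         samples = []
         for _ in range(10):       # one sample per second for ten seconds
             samples.append(gpu_utilization())
             time.sleep(1)

         avg = sum(samples) / len(samples)
         print(f"average GPU load: {avg:.0f}%")
         print("likely CPU bottleneck" if avg < 95 else "GPU-bound, as it should be")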
  19. Still doesn't prove Zen won't have AVX512. That's what Opcode has been asked to do.
  20. And the best part is that the GPU is locked to around 80°C by default. Raise that to 95°C and I'm sure it will hit 90°C. And ofc Nvidia doesn't get any bashing. There won't be any aftermarket coolers for this one, Nvidia doesn't allow it >.<
  21. Nothing, hence why it's stating consumer-grade CPUs won't have AVX512. Just like others said, lots of software isn't even using AVX at all.
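     If you want to see what your own CPU exposes instead of arguing about slides, the kernel's CPUID dump has it. Linux-only sketch; on other OSes you'd need a CPUID library instead:

         # Print which AVX variants the running CPU reports via /proc/cpuinfo.
         # The flag names are the kernel's; avx512f is the AVX-512 Foundation
         # subset, the baseline for all the other AVX-512 extensions.
         def cpu_flags():
             with open("/proc/cpuinfo") as f:
                 for line in f:
                     if line.startswith("flags"):
                         return set(line.split(":", 1)[1].split())
             return set()

         flags = cpu_flags()
         for feature in ("avx", "avx2", "avx512f"):
             print(feature, "yes" if feature in flags else "no")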
  22. "Selling this system tomorrow. New build next mo…"

    What are you getting instead?
  23. Which ones? So far all I can find is the Sandisk one (which is a lot slower) & this one; the Samsung one is faster but doesn't have NVMe support: http://www.anandtech.com/show/8865/samsung-launches-sm951-m2-pcie-30-x4-ssd-for-oemssis
  24. Just to let you know, the i7 is only 20-30% faster in video editing; if you only edit a video once a week, bring the CPU down to a 4690K. Don't get an H100i unless it's for aesthetics; those LGA1150 chips don't kick out enough heat to leverage the cooling performance of such a cooler. A 750/850W PSU is useless, all you need is a 450W unit (yes, even for overclocking the GPU/CPU).
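     Back-of-envelope numbers behind that 450W claim. Every figure here is a ballpark TDP-style assumption I'm making, not a measurement of this exact build:

         # Rough power budget; all numbers are ballpark assumptions.
         cpu_w     = 88    # i5-4690K stock TDP; a heavy OC adds more
         gpu_w     = 180   # upper-mid single GPU under full load
         rest_w    = 50    # board, RAM, drives, fans
         oc_margin = 40    # headroom for CPU/GPU overclocks

         total = cpu_w + gpu_w + rest_w + oc_margin   # ~358 W worst case
         print(f"worst-case draw ~{total} W")
         print(f"load on a 450 W unit: {total / 450:.0%}")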
  25. Ah, so first you said it can't stream, then later that it won't work very well, but it's your dad's PC. Pretty sure you haven't actually tried streaming on your dad's PC, or you wouldn't be changing your claims right now. Multi means two or more, so a dual core is a multi-core processor. The 6300 is not that much of a power hog next to an i3: http://www.anandtech.com/show/8427/amd-fx-8370e-cpu-review-vishera-95w/2