Faa

Banned
  • Posts

    5,181
  • Joined

Awards

This user doesn't have any awards

Profile Information

  • Gender
    Not Telling

System

  • CPU
    3930k
  • Motherboard
    Rampage IV Extreme
  • RAM
    Corsair Dominator Platinum 16GB (4x4GB) 1600MHz
  • GPU
    970
  • Case
    Silverstone TJ11
  • Storage
    Samsung 840 Pro 512GB & Samsung 840 Evo 1TB
  • PSU
    Corsair AX1500i
  • Display(s)
    Samsung S27A750D/LG 34UM95
  • Cooling
    Enermax Liqtech 120x
  • Keyboard
    Corsair K90
  • Mouse
    Razer taipan
  • Sound
    junk

Recent Profile Visitors

2,765 profile views
  1. Haha, coming from Freesync all the way to republican governments lol. Sorry dude, you're the one who's crying, which just tells me you're the biased one. Says the guy who got scammed (8320 owner).
  2. Yeah: "Let's address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic." That's the Gsync module, better known as the Altera Arria V GX. Nvidia has no right to charge a license fee since it's not their product at all. "But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module." http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion Stop asking people for a source if you can't even prove your own claims.
  3. Nope, as PCPER stated, you need a local frame buffer. AMD would have done it through drivers if it was possible. Yes, Gsync is royalty-free too. They can't charge a license fee on someone else's hardware.
  4. This has been PR'ed for almost 2 years or something and we haven't seen a single one launch.
  5. Lol 24GB? Specification:
     - GeForce Titan XXX
     - GPU: GeForce Titan XXX
     - Core Base Clock: 1268MHz
     - Core Boost Clock: 1493MHz
     - Memory Clock: 8600MHz Samsung GDDR5
     - Memory Size: 24576MB GDDR5
     - Bus Type: PCI Express 3.0
     - Memory Bus: 384-bit
     - CUDA Cores: 3072
     - DirectX 12: Yes
     - DVI Port: 1x Dual-Link DVI, 3x DisplayPort & 1x HDMI
     - DisplayPort: Yes
     - HDCP: Yes
     - HDMI: Yes
     - Power: 2x 8-Pin (2x 8-Pin to engage OC mode) - 650W PSU Required - 300W TDP (250W regular use / 300W OC mode)
     - Dimensions: L=282mm, W=110mm, H=43mm
     - Warranty: 10yr (BFG Lifetime Warranty (OcUK provides warranty))
  6. You'll get tearing/stutter below the min refresh rate with AS, and Nvidia's fix is multiplying the frame rate. VRR works even at 1 FPS on Gsync. Atm that's not possible on AS because the monitor needs a local frame buffer. But you get slight flickering, so the choice is flickering vs stutter. Here you go: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind Pretty sure WCCF copy-pasted everything from PCPER. According to PCPER, AS doesn't bring the experience Gsync does.
  7. So is Gsync. Manufacturers are only paying for the module, which is around $40-60. Also, needing driver support plus a compatible GPU shouldn't be advertised as license-free. There's no such thing as "Freesync", it's just a marketing term like Retina. How exactly is that biased when it's backed by a video you can't argue with? They've gone in depth on why they prefer Gsync, mainly because the VRR zone is much better: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind
  8. But does Mantle solve your performance issues or not?
  9. Thank God you're real. Yeah, AMD blaming the scalers they've been designing together with many ASIC companies such as Realtek is stupid. "For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area." Do you have anything better to do than backpedal after being asked to provide a link? Your PR pictures prove even less than people's attempts to prove God exists.
  10. You'll only believe something if it comes straight from AMD, but if there's any negativity about Intel/Nvidia or positivity about AMD, you'll take any source seriously no matter how shit it is.
  11. "NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates." That's Nvidia's response. "For its part, AMD says that ghosting is an issue it is hoping to lessen on FreeSync monitors by helping partners pick the right components (Tcon, scalars, etc.) and to drive a “fast evolution” in this area." That's AMD's response, which made no sense. Monitor manufacturers know better than AMD. Gsync doesn't replace the Tcon at all, so quite sad.
  12. Nope, you're just making claims you can't prove.
  13. Yeah, a few things: either they don't know what ghosting is, or they abuse the term the way "lag" gets used for low FPS. Only one thread: https://www.google.be/?gws_rd=ssl#q=site:geforce.com+swift+ghosting That's in 3D mode (Gsync doesn't work in 3D), so stop making shit up as you always do. You've gotten a video comparing two monitors using the same exact panel, with Freesync causing ghosting. That's enough to jump off your zeppelin.
  14. The BenQ XL2730Z & Swift both use the same panel made by AUO (BenQ is to AUO what Crucial is to Micron), so yes it does.
  15. That won't work at all, you have to manipulate the Vblank interval to have VRR. If you're just mindlessly dumping all of your frames to your monitor, you only create more input lag, since the display has more frames to process. Even your understanding of how AS works is flawed. Dynamically? It's variable. The GPU is not mindlessly sending frames to the monitor, that's not the purpose of a sync.
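The frame-multiplying fix mentioned in a couple of the posts above (repeating frames when the game's frame rate falls below the panel's minimum refresh, so the scan-out rate lands back inside the variable-refresh window) can be sketched roughly like this. This is a minimal illustration only: the 30-144Hz window, the function name, and the integer-repeat strategy are my assumptions, not how any actual driver or Gsync module implements it:

```python
def lfc_refresh(frame_hz: float, min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    """Pick a panel scan-out rate for a given frame rate (illustrative sketch).

    Inside the VRR window the panel simply refreshes once per frame.
    Below the window, each frame is scanned out an integer number of
    times so the effective refresh rate climbs back above the minimum.
    """
    if frame_hz >= min_hz:
        # In range: one scan-out per frame, capped at the panel maximum.
        return min(frame_hz, max_hz)
    # Out of range: smallest repeat count that lifts scan-out above min_hz.
    repeats = 2
    while frame_hz * repeats < min_hz:
        repeats += 1
    return min(frame_hz * repeats, max_hz)
```

So a 24 FPS game would be scanned out at 48Hz (each frame shown twice), and even 1 FPS stays inside the window by repeating each frame many times, which is why the posts argue the panel-side frame buffer matters below the minimum refresh.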