Acer XB270HU: 1440p 144hz G-Sync. Competition for the Swift.

Gdourado

Of course it will cost some money to implement FreeSync, but the end result is nothing compared to the implementation, module, and licensing costs you have to cover with G-Sync.

Aside from being limited to DP, it has some huge advantages over G-Sync beyond just cost.

Not really. FreeSync has no advantage other than being a VESA standard.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Because G-Sync uses an overpriced FPGA processor that could easily be replaced by a MUCH MUCH cheaper IC chip doing the exact same thing. Furthermore, it has to have 768MB of monitor RAM, and since it's proprietary Nvidia tech, using only Nvidia's own chip, there is no competition on the G-Sync hardware itself. All these things send the price skyrocketing.

FPGAs are usually used for prototypes. I guess it was good Nvidia used one, as G-Sync did not work when it was supposed to launch (the reason it was delayed half a year).

Since Adaptive Sync is a standard, several scaler vendors can make (and have made) their own AS scaler IC chips. This is why AS is at least $100 cheaper than G-Sync. Competition between the vendors, and the novelty wearing off, will lower the price a lot in the next year too, I assume. After all, these ICs should not be costlier in production than the current ones; only the sunk development cost will be paid off on the first products.

Is that why G-Sync monitors like the ASUS ROG ONLY come with DisplayPort connectors? DERP!

A newly developed scaler IC will obviously come with a price premium, but again, competition between the scaler vendors and the payoff of the sunk development cost should lower those prices fairly quickly.

Pretty much. Both rely on the eDP variable VBLANK signal. G-Sync just does it in a convoluted way, using proprietary tech and older DP controllers (1.2, not 1.2a or 1.3).

As such "Freesync", or Adaptive Sync as the hardware standard is called, does exactly the same as G-sync, only less complicated, cheaper and as a standard.

Given the demos of FreeSync so far, no one should be impressed. They do not accomplish the same task the same way.

Furthermore, yes, Asus stuck with DP, but G-Sync works with every connector type, so HDMI 2 and dual-link DVI will also work. Acer and Samsung both now have monitors with all inputs and G-Sync.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Given the demos of FreeSync so far, no one should be impressed. They do not accomplish the same task the same way.

Furthermore, yes, Asus stuck with DP, but G-Sync works with every connector type, so HDMI 2 and dual-link DVI will also work. Acer and Samsung both now have monitors with all inputs and G-Sync.

 

When you say they do not accomplish the same task the same way, do you mean that G-Sync is more effective at eliminating tearing?


Given the demos of FreeSync so far, no one should be impressed. They do not accomplish the same task the same way.

Furthermore, yes, Asus stuck with DP, but G-Sync works with every connector type, so HDMI 2 and dual-link DVI will also work. Acer and Samsung both now have monitors with all inputs and G-Sync.

 

Given that all the FreeSync demos so far were proofs of concept, NOT based on the finalized Adaptive Sync standard and only using existing firmware-updated scalers, I'd say those demos prove nothing about the final product.

 

G-Sync is DP exclusive. If a monitor has other connectors, those will work with the monitor, just without support for G-Sync. Which Samsung monitor has G-Sync? Which G-Sync monitors have anything other than a DP connector?

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Not really. FreeSync has no advantage other than being a VESA standard.

I'd rather my GPU not poll and wait on my monitor's response to see if it's ready for another frame (creating overhead and latency). With FreeSync, the GPU just sends whatever frames it has ready and the monitor dynamically alters its refresh rate to stay in sync. Another huge advantage is FreeSync's dynamic range (9-240 Hz), which is future-proof and nearly twice that of G-Sync's (30-144 Hz). With FreeSync, the monitor only refreshes when there's another frame available. Nvidia took this technology and simply blew it out of proportion, making proprietary modules, thinking they would own the gaming market with it, when in reality FreeSync has a much better implementation.


All I keep hearing is talk and more talk.
I have nothing against an alternative to G-Sync, but I haven't seen it, neither have you, and neither have most PC hardware publications.

 

He's right. Calculate the bandwidth yourself. DP 1.2 actually craps out at about 112 FPS on 1440p.

This is why Thunderbolt 3.0 is not a bad thing, given its huge bandwidth jump over DP 1.2. (Yes, TB uses the DP display standard, but it essentially carries four asynchronous copies of that high-bandwidth link inside.)

 

I have the Swift, have I not said it yet? It has a strobe backlight mode which only works at certain refresh rates, the highest being 120 Hz. I can use that just fine, 144 Hz just fine, and I can make a direct comparison by having my VG248QE side by side with the Swift.

Why the fuck are you telling ME that the refresh rate is capped at 112 Hz when I have the fucking monitor in front of me and have used it for 2 months now? Do you take me for an idiot? You don't even have it and you're talking this nonsense out of nowhere?
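
For what it's worth, the arithmetic supports the Swift owner here. A rough check (assuming 24-bit colour and CVT-R2 reduced blanking; the blanking figures are approximations) shows 1440p at 144 Hz fits within DP 1.2's payload bandwidth with room to spare:

```python
# Back-of-the-envelope check of the "DP 1.2 caps out at ~112 FPS at 1440p"
# claim. Blanking figures assume CVT-R2 reduced blanking (approximate).

H_ACTIVE, V_ACTIVE, REFRESH = 2560, 1440, 144
H_TOTAL, V_TOTAL = 2720, 1481        # active + blanking (assumed)
BITS_PER_PIXEL = 24

pixel_clock = H_TOTAL * V_TOTAL * REFRESH             # ~580 MHz
required_gbps = pixel_clock * BITS_PER_PIXEL / 1e9    # ~13.9 Gbit/s

# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s raw, 8b/10b encoding leaves 80% payload.
dp12_payload_gbps = 4 * 5.4 * 0.8                     # 17.28 Gbit/s

print(f"needed: {required_gbps:.1f} Gbit/s")
print(f"DP 1.2 payload: {dp12_payload_gbps:.2f} Gbit/s")
# 13.9 < 17.28, so 1440p at 144 Hz fits; the ~112 FPS figure is too low.
```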

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


I'd rather my GPU not poll and wait on my monitor's response to see if it's ready for another frame (creating overhead and latency). With FreeSync, the GPU just sends whatever frames it has ready and the monitor dynamically alters its refresh rate to stay in sync. Another huge advantage is FreeSync's dynamic range (9-240 Hz), which is future-proof and nearly twice that of G-Sync's (30-144 Hz). With FreeSync, the monitor only refreshes when there's another frame available. Nvidia took this technology and simply blew it out of proportion, making proprietary modules, thinking they would own the gaming market with it, when in reality FreeSync has a much better implementation.

Oh please don't make these claims when there isn't a single FreeSync monitor out there for us to test and compare with.

“The mind of the bigot is like the pupil of the eye; the more light you pour upon it the more it will contract” -Oliver Wendell Holmes “If it can be destroyed by the truth, it deserves to be destroyed by the truth.” -Carl Sagan


GSYNC works with all display inputs whereas FreeSync is DP only.

Second, FreeSync still costs implementation money, even without royalties.

G-Sync works with all display inputs? Are you serious? :D

Location: Kaunas, Lithuania, Europe, Earth, Solar System, Local Interstellar Cloud, Local Bubble, Gould Belt, Orion Arm, Milky Way, Milky Way subgroup, Local Group, Virgo Supercluster, Laniakea, Pisces–Cetus Supercluster Complex, Observable universe, Universe.


12700, B660M Mortar DDR4, 32GB 3200C16 Viper Steel, 2TB SN570, EVGA Supernova G6 850W, be quiet! 500FX, EVGA 3070Ti FTW3 Ultra.

 


G-Sync works with all display inputs? Are you serious? :D

I got one in my ear. I just plug my PC into it and I can watch movies in my brain.


When you say they do not accomplish the same task the same way, do you mean that G-Sync is more effective at eliminating tearing?

No, but the demos seem to show that. The algorithm may be poorly implemented. I say withhold judgment.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I apologize. G-Sync's future iterations will support other inputs, according to Nvidia.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Given that all the FreeSync demos so far were proofs of concept, NOT based on the finalized Adaptive Sync standard and only using existing firmware-updated scalers, I'd say those demos prove nothing about the final product.

G-Sync is DP exclusive. If a monitor has other connectors, those will work with the monitor, just without support for G-Sync. Which Samsung monitor has G-Sync? Which G-Sync monitors have anything other than a DP connector?

Currently launching. There are links in the Tech News section actually.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


price?

Dream machine (there is also a build log):

Case: Phanteks Enthoo Luxe - CPU: I7 5820k @4.4 ghz 1.225vcore - GPU: 2x Asus GTX 970 Strix edition - Mainboard: Asus X99-S - RAM: HyperX predator 4x4 2133 mhz - HDD: Seagate barracuda 2 TB 7200 rpm - SSD: Samsung 850 EVO 500 GB SSD - PSU: Corsair HX1000i - Case fans: 3x Noctua PPC 140mm - Radiator fans: 3x Noctua PPC 120 mm - CPU cooler: Fractal design Kelvin S36 together with Noctua PPCs - Keyboard: Corsair K70 RGB Cherry gaming keyboard - mouse: Steelseries sensei raw - Headset: Kingston HyperX Cloud Build Log


When you say they do not accomplish the same task the same way, do you mean that G-Sync is more effective at eliminating tearing?

What he means is that G-Sync works in a different way to accomplish the same result. G-Sync has a module built into the display that dispatches and receives messages to and from the GPU. It basically tells the GPU to send more frames if the display is ready, or to hold up and wait if it's not. This way the display gets frames in time with the screen's refresh rate. With FreeSync, the GPU just spews out its ready frames to the display. FreeSync works within the display's firmware and dynamically adjusts the display's refresh rate on the fly to accommodate the available frames. This not only majorly cuts back on tearing, but should also provide a buttery-smooth gaming experience without stutter.
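
A rough sketch of the two models as described in this post (the handshake details and timings are invented for illustration; neither protocol is actually this simple):

```python
# Toy contrast of the two approaches described above. The handshake and
# all timings are invented for illustration only.

import time

class GSyncStyleDisplay:
    """Display with a module the GPU polls before presenting."""
    def __init__(self, max_hz: float = 144.0):
        self.min_interval = 1.0 / max_hz
        self.busy_until = 0.0

    def ready(self) -> bool:          # the GPU polls this each frame
        return time.monotonic() >= self.busy_until

    def scan_out(self, frame) -> None:
        self.busy_until = time.monotonic() + self.min_interval

def present_polling(display: GSyncStyleDisplay, frame) -> None:
    # Poll-and-wait: the handshake adds a little latency per frame.
    while not display.ready():
        time.sleep(0.0001)
    display.scan_out(frame)

class AdaptiveSyncDisplay:
    """Display that retimes its own refresh around incoming frames."""
    def scan_out(self, frame) -> None:
        pass  # firmware stretches VBLANK until this frame arrived

def present_push(display: AdaptiveSyncDisplay, frame) -> None:
    display.scan_out(frame)           # just push; no handshake
```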


What he means is that G-Sync works in a different way to accomplish the same result. G-Sync has a module built into the display that dispatches and receives messages to and from the GPU. It basically tells the GPU to send more frames if the display is ready, or to hold up and wait if it's not. This way the display gets frames in time with the screen's refresh rate. With FreeSync, the GPU just spews out its ready frames to the display. FreeSync works within the display's firmware and dynamically adjusts the display's refresh rate on the fly to accommodate the available frames. This not only majorly cuts back on tearing, but should also provide a buttery-smooth gaming experience without stutter.

So with these two technologies side by side, it would be impossible to tell them apart?


So with these two technologies side by side, it would be impossible to tell them apart?

 

Correct, although Adaptive Sync (FreeSync) can cover a much larger framerate interval than G-Sync (G-Sync: 30-144 Hz; Adaptive Sync/FreeSync: 9-240 Hz), so you can use Adaptive Sync for movie playback as well.
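
A worked example of why the wider window matters for video, using the ranges quoted above: film runs at 24 fps, which falls below a 30 Hz floor but sits comfortably inside a 9-240 Hz window.

```python
# Film is 24 fps (~41.7 ms per frame). A display can only track a frame
# rate one-to-one if it sits inside the VRR window; otherwise frames
# must be repeated. Ranges below are the ones quoted in this thread.

def tracks_one_to_one(fps: float, vrr_min: float, vrr_max: float) -> bool:
    return vrr_min <= fps <= vrr_max

print(tracks_one_to_one(24, 30, 144))  # False: below G-Sync's quoted floor
print(tracks_one_to_one(24, 9, 240))   # True: inside the Adaptive Sync range
```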

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Correct, although Adaptive Sync (FreeSync) can cover a much larger framerate interval than G-Sync (G-Sync: 30-144 Hz; Adaptive Sync/FreeSync: 9-240 Hz), so you can use Adaptive Sync for movie playback as well.

Not really much incentive to go with G-Sync at all, then. What the hell are they thinking?


Not really much incentive to go with G-Sync at all, then. What the hell are they thinking?

 

Well, Nvidia always goes the closed, proprietary route. The results are always higher consumer prices, less and slower market adoption, and sometimes worse performance as well. However, you have to give credit where it is due: Nvidia did go first to market with a product, and they did initiate all this synced fps/Hz business.

 

Now that we have a functioning, license-free standard adopted by a lot of scaler vendors, however, G-Sync is just redundant, and kind of obsolete too.

 

The problem still stands that Nvidia does not want to support Adaptive Sync, and that no Nvidia graphics cards (including the 900 series) will be able to support it, due to older DisplayPort controllers on the cards.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Keep up the competition! Hopefully prices will start to drop on some of these monitors. (What we really need is for Nvidia to hop on the Adaptive Sync train or drop the G-Sync license fee.)


From what I read about G-Sync, what it does is sync the monitor's refresh rate with the GPU on the fly. That is particularly important in games, since the fps (the refresh rate of the GPU) in a game engine can be random in nature and shift in real time. One second the game can be running at 130 fps, and a second later it can drop to 90, then increase to 125, then drop to 107, and so on. It is random, and depends on a lot of factors in the game itself and the rest of your system.

G-Sync keeps the monitor synced with the game's refresh rate at all times and in real time. To me, that is what G-Sync does and why it needs its hardware module on the monitor, with its own processor and memory.

Is FreeSync comparable? That is the real question. I have never seen or heard of a FreeSync demo where the monitor was synced in real time to a randomly shifting refresh rate.

So far, FreeSync has only synced GPU with monitor for fixed scenes, not game engines running in real time.
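
To make the question concrete, here is a small simulation of the scenario described above (purely illustrative numbers): frame times jump around at random, and we compare the extra waiting a fixed 144 Hz refresh imposes against a display that retimes each refresh to the frame.

```python
# Simulation of a randomly shifting frame rate (90-130 fps) against a
# fixed 144 Hz refresh vs a variable-refresh display. Numbers are
# purely illustrative.

import random

random.seed(1)
frame_times = [1 / random.uniform(90, 130) for _ in range(1000)]

REFRESH = 1 / 144
fixed_wait = 0.0
for ft in frame_times:
    # With a fixed refresh, a finished frame waits for the next scan-out
    # (this leftover is the delay VSync adds; without VSync it tears).
    fixed_wait += (REFRESH - (ft % REFRESH)) % REFRESH

vrr_wait = 0.0  # a variable-refresh display scans out when the frame is ready

print(f"avg added wait, fixed 144 Hz: {fixed_wait / len(frame_times) * 1e3:.2f} ms")
print(f"avg added wait, VRR:          {vrr_wait:.2f} ms")
```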


From what I read about G-Sync, what it does is sync the monitor's refresh rate with the GPU on the fly. That is particularly important in games, since the fps (the refresh rate of the GPU) in a game engine can be random in nature and shift in real time. One second the game can be running at 130 fps, and a second later it can drop to 90, then increase to 125, then drop to 107, and so on. It is random, and depends on a lot of factors in the game itself and the rest of your system.

G-Sync keeps the monitor synced with the game's refresh rate at all times and in real time. To me, that is what G-Sync does and why it needs its hardware module on the monitor, with its own processor and memory.

Is FreeSync comparable? That is the real question. I have never seen or heard of a FreeSync demo where the monitor was synced in real time to a randomly shifting refresh rate.

So far, FreeSync has only synced GPU with monitor for fixed scenes, not game engines running in real time.

 

Your understanding of G-Sync is correct indeed, but that is exactly what Adaptive Sync (FreeSync) does as well. The demos so far have not been with a proper Adaptive Sync scaler IC, but only with firmware-updated existing monitors and scalers. One of their demos could do this over an interval of 40-60 Hz, proving it worked (proof of concept). The final version with a proper AS scaler IC should come out in a month or two, with a hertz interval of 24-60 or up to 144 (the AS standard itself supports an interval of 9-240 Hz).

 

The reason for the overly expensive G-Sync module is that it is a functioning computer, with an FPGA processor and monitor RAM. FPGAs are usually used for prototyping, before making a much cheaper IC chip. A proper IC chip can do exactly what the FPGA does in the monitor, and that is what Adaptive Sync monitors will have: an IC chip, meaning the monitor will be a lot cheaper (the rumour is $100 cheaper).

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


4 months later...

Acer will release a new gaming screen, the 27-inch XB270HU. It is a 144 Hz TN screen with G-Sync at a nice WQHD resolution of 2560x1440, fairly similar to the Asus ROG Swift PG278Q. But hey, finally some competition in this range, right?

The specs are fairly similar to the Asus one as well; an AU Optronics panel is used here too. The panel has a 1 ms G2G response time, a 144 Hz refresh rate, and support for Nvidia's G-Sync technology. It also supports 3D Vision and Nvidia's Ultra Low Motion Blur (ULMB) mode, which is part of the G-Sync feature set. Other specs include a 1000:1 contrast ratio, 350 cd/m² brightness, 170/160 viewing angles, 8-bit colour depth (16.7M colours), and a W-LED backlight offering sRGB colour gamut.

Original article: http://www.guru3d.com/news-story/acer-will-release-27-inch-1440p-tn-film-xb270hu-g-sync-144hz.html

Cheers!

It is not a TN! It is an AHVA IPS panel! GURU3D LIES! NVIDIA IS RIGHT!

Also, this panel DOES NOT support 3D Vision!

Here is a better source: http://www.geforce.com/whats-new/articles/nvidia-g-sync-worlds-first-144hz-ips-monitor-unveiled

Here is a review for this monitor. ALL other sources like Drama stars and Guru3D are wrong! http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


Well, it's better than the Swift because it's IPS. Also, there is already a review out for this monitor at http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


It is not a TN! It is an AHVA IPS panel! GURU3D LIES! NVIDIA IS RIGHT!

Also, this panel DOES NOT support 3D Vision!

Here is a better source: http://www.geforce.com/whats-new/articles/nvidia-g-sync-worlds-first-144hz-ips-monitor-unveiled

Here is a review for this monitor. ALL other sources like Drama stars and Guru3D are wrong! http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

Why are you digging up such an old thread?

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.

