
FreeSync 2 with Nvidia graphics card?

althsus

If I were to use an Nvidia graphics card with a monitor that has FreeSync 2 and HDR certification, would I lose out on a lot of the features the monitor provides? As far as I know, Nvidia cards are now FreeSync compatible (or rather, FreeSync monitors are G-Sync Compatible). Would I still be able to use HDR and the other features that FreeSync 2 brings?


No.

 

Technically, Nvidia cards don't support FreeSync either. What they added support for is Adaptive Sync, an open standard by VESA that happens to have been partially derived from FreeSync back then. As a result, you can't use the entire FreeSync feature set with Nvidia GPUs that support VESA Adaptive Sync, for example variable refresh rate over HDMI. It's DisplayPort only for Nvidia, while AMD cards can use both DP and HDMI.

 

I hope that explains why FreeSync 2 features aren't supported on GeForce cards. Both companies still stick to their own paths here, with Nvidia pushing G-Sync Ultimate against FreeSync 2 (or maybe the other way around).
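As an aside, if you're on Linux with the amdgpu driver, you can sanity-check whether a display actually reports Adaptive Sync by looking for the `vrr_capable` output property in `xrandr --props`. Here's a small sketch; the captured sample output and the `check_vrr` helper are just for illustration:

```shell
# Illustrative sample of `xrandr --props` output; on amdgpu,
# "vrr_capable: 1" means the display on that output reports
# Adaptive Sync / FreeSync support.
sample="DP-1 connected primary 1920x1080+0+0
  vrr_capable: 1
HDMI-A-1 connected 1920x1080+1920+0
  vrr_capable: 0"

# Print whether a given output (e.g. DP-1) advertises VRR.
check_vrr() {
  printf '%s\n' "$sample" | awk -v out="$1" '
    $1 == out { found = 1; next }
    found && $1 == "vrr_capable:" {
      print ($2 == 1 ? out " supports VRR" : out " has no VRR"); exit
    }'
}

check_vrr DP-1       # DP-1 supports VRR
check_vrr HDMI-A-1   # HDMI-A-1 has no VRR
```

On a real system you'd pipe `xrandr --props` in directly instead of using the sample string. As far as I know, Nvidia's proprietary driver doesn't expose this property, which fits the point above about the two vendors going their own ways.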

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


3 minutes ago, Jurrunio said:

No.

 

Technically, Nvidia cards don't support FreeSync either. What they added support for is Adaptive Sync, an open standard by VESA that happens to have been partially derived from FreeSync back then. As a result, you can't use the entire FreeSync feature set with Nvidia GPUs that support VESA Adaptive Sync, for example variable refresh rate over HDMI. It's DisplayPort only for Nvidia, while AMD cards can use both DP and HDMI.

 

I hope that explains why FreeSync 2 features aren't supported on GeForce cards. Both companies still stick to their own paths here, with Nvidia pushing G-Sync Ultimate against FreeSync 2 (or maybe the other way around).

Will I be losing any of the practically important features, such as HDR, if I choose to use an Nvidia card with my FreeSync 2 monitor? I'm assuming G-Sync, or rather Adaptive Sync, will work just fine.


7 hours ago, althsus said:

Will I be losing any of the practically important features, such as HDR, if I choose to use an Nvidia card with my FreeSync 2 monitor? I'm assuming G-Sync, or rather Adaptive Sync, will work just fine.

Yes, all of the extra stuff. G-Sync Ultimate has its own HDR implementation (it was called G-Sync HDR in the past) and Nvidia wants you to stick with that. You only get variable refresh rate.



4 minutes ago, Jurrunio said:

Yes, all of the extra stuff. G-Sync Ultimate has its own HDR implementation (it was called G-Sync HDR in the past) and Nvidia wants you to stick with that. You only get variable refresh rate.

Oh wow. I would have thought at least HDR would also work with an Nvidia card, as technically the monitor is HDR certified. Guess I will have to go with an AMD card then. Also, would you happen to know if HDR reduces FPS performance in games?


3 hours ago, althsus said:

Oh wow. I would have thought at least HDR would also work with an Nvidia card, as technically the monitor is HDR certified. Guess I will have to go with an AMD card then. Also, would you happen to know if HDR reduces FPS performance in games?

Oh wait, it does; HDR is considered another open standard. FreeSync 2 does include extras such as automatic switching between HDR and SDR (among other things), which don't work when paired with an Nvidia GPU (at least from what I have read).

 

Basically, you lose the minor stuff but keep the major features.


