If AMD gives NVidia users the ability to use BlalckSync, could that make G-Sync crash?

Gaming_Addiction

What is "BlalckSync"?

CPU: i5 3570k CPU Cooler: Hyper 212+ Mobo: Asrock Pro 4 Z77 GPU: Sapphire Tri-X R9 290 PSU: Seasonic G 650w HDD: Seagate Barracuda 1TB SSD: Kingston v300 Sound Card: Asus xonar dg Monitor: Acer K272HUL, HP 22bw

Keyboard: Corsair K70 Mouse: Logitech G400 Mousepad: Steelseries Qck+ Case: Corsair Air 540 Fans: 4x Noctua NF-P12 (saving up for more)


blasdsfnl scync

Oh, ok



I don't even know what blasdsfnl is...

PC: Corsair C70 Arctic, FX 9370, Corsair H80i, Gigabyte 990fxa-ud3, Corsair Vengeance 16GB, Palit JetStream GTX 970, OCZ Vertex 4 128GB and Western Digital Blue 1TB + 500GB, Antec Gamer 520w

Peripherals: Logitech G19 and SteelSeries Sensei RAW

Toshiba L50-A: i7 4700mq, 8gb, 1TB HDD, GT 740M 2gb


I thought AMD's implementation was 'free sync'

Linux "nerd".  If I helped you please like my post and maybe add me as a friend :)  ^_^!


I'm sorry, but FreeSync is a load of crap IMO. If this technology works so well, why didn't NVIDIA take advantage of it first? Why wouldn't NVIDIA save the time and money instead of creating a new standard? The reason? There is something fatally wrong with FreeSync. What that is, I don't know, but there is something wrong.

Main Rig: CPU: AMD Ryzen 7 5800X | RAM: 32GB (2x16GB) KLEVV CRAS XR RGB DDR4-3600 | Motherboard: Gigabyte B550I AORUS PRO AX | Storage: 512GB SKHynix PC401, 1TB Samsung 970 EVO Plus, 2x Micron 1100 256GB SATA SSDs | GPU: EVGA RTX 3080 FTW3 Ultra 10GB | Cooling: ThermalTake Floe 280mm w/ be quiet! Pure Wings 3 | Case: Sliger SM580 (Black) | PSU: Lian Li SP 850W

 

Server: CPU: AMD Ryzen 3 3100 | RAM: 32GB (2x16GB) Crucial DDR4 Pro | Motherboard: ASUS PRIME B550-PLUS AC-HES | Storage: 128GB Samsung PM961, 4TB Seagate IronWolf | GPU: AMD FirePro WX 3100 | Cooling: EK-AIO Elite 360 D-RGB | Case: Corsair 5000D Airflow (White) | PSU: Seasonic Focus GM-850

 

Miscellaneous: Dell Optiplex 7060 Micro (i5-8500T/16GB/512GB), Lenovo ThinkCentre M715q Tiny (R5 2400GE/16GB/256GB), Dell Optiplex 7040 SFF (i5-6400/8GB/128GB)


I'm sorry, but FreeSync is a load of crap IMO. If this technology works so well, why didn't NVIDIA take advantage of it first? Why wouldn't NVIDIA save the time and money instead of creating a new standard? The reason? There is something fatally wrong with FreeSync. What that is, I don't know, but there is something wrong.

NVIDIA just wants money; they are not focused on innovation.


NVIDIA just wants money; they are not focused on innovation.

If they "just wanted money", they wouldn't have wasted many millions of dollars developing G-Sync. They would have just jumped onto VBLANK before AMD did. Here is a relevant quote from NVIDIA's own Tom Peterson on why FreeSync is questionable as a desktop alternative.

 

 

Laptops have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel. Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel.
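For what it's worth, the "VBLANK" trick being argued about here just means the display holds its blanking interval until the GPU has the next frame ready. Below is a rough Python sketch of that idea; the 30-144 Hz window, the frame times, and the function name are made-up illustrations, not anything from AMD or NVIDIA.

def scanout_interval(render_ms, panel_min_hz=30, panel_max_hz=144):
    # Toy model of variable VBLANK: hold the blanking period until the next
    # frame is ready, but never refresh faster or slower than the panel allows.
    fastest = 1000.0 / panel_max_hz   # ~6.9 ms, the shortest interval this panel supports
    slowest = 1000.0 / panel_min_hz   # ~33.3 ms, the longest it can hold a frame
    return min(max(render_ms, fastest), slowest)

for render_ms in (10.0, 16.7, 22.4, 40.0):
    print(f"frame ready after {render_ms:5.1f} ms -> scanned out after {scanout_interval(render_ms):5.1f} ms")

The last case is the interesting one: a frame slower than the panel's minimum refresh rate has to be handled some other way (repeating the previous frame or tearing), and on a desktop that is exactly where the scaler chip sitting between the GPU and the panel would have to cooperate.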



If they "just wanted money", they wouldn't have wasted many millions of dollars developing G-Sync. They would have just jumped onto VBLANK before AMD did. Here is a relevant quote from NVIDIA's own Tom Peterson on why FreeSync is questionable as a desktop alternative.

AMD wouldn't just claim they have an alternative if they had nothing. I highly doubt they developed it on a laptop, so they must have done some testing with an actual PC.


AMD wouldn't just claim they have an alternative if they had nothing. I highly doubt they developed it on a laptop, so they must have done some testing with an actual PC.

They showed the test at CES on a laptop. Why not a desktop? They claim it's because it can run on even low-end hardware, but that lower-latency laptop display link is what makes me believe otherwise.



They showed the test at CES on a laptop. Why not a desktop? They claim it's because it can run on even low-end hardware, but that lower-latency laptop display link is what makes me believe otherwise.

AMD could have their reasons: maybe they were serious about proving it can run on low-end hardware, or maybe they weren't totally prepared to show it off completely. G-Sync was announced three months ago, and AMD has shown that they want to try and innovate for the future. Getting this far within three months must have been a rush. Give AMD a break; they are trying to save you from overpaying for $600 TN monitors.

 

To me, NVIDIA must be really butthurt, so they could also be making false claims even though they have no access to AMD's software. We have to wait and see what AMD has to say in its own defence.


Nobody will overpay for Nvidia's overpriced monitors if they already have the technology. What would happen to G-Sync if AMD actually does that?

 

Actually, people will, because people are stupid and think they will be getting a premium product.


Actually, people will, because people are stupid and think they will be getting a premium product.

100% of people will just toggle it in the BIOS, in game, or by opening the software. If they release it, Nvidia could stop producing the monitors because they know they would lose money.


May I just point out that there have only been two prices announced, aside from the DIY kits, which are hard to obtain and therefore inflated in price.

If Nvidia really cared ONLY about money and not innovation, they wouldn't have made Shield.

The way you're acting is irrational. Yeah, Nvidia has a focus on money, but you also can't blame Nvidia for $600 monitors; the fact is Asus and Philips are the ones who chose what each is worth.

Also, the Asus one is 144 Hz and 2560x1440.

AMD may have something cool, but since it's free it will probably be good, not great, compared with paying a small premium for G-Sync; and most likely, as G-Sync becomes more common, that premium will naturally come down.

My PC:

Case: Corsair C70, Motherboard: MSI Z87-G45, CPU: I5-4670k, RAM: 16GB Corsair Vengeance, GPU: Gigabyte 970 G1 Gaming , PSU: Corsair RM650W Gold, Storage: 250 GB Samsung EVO SSD, 240 GB Kingston SSDNOW


I don't think this "FreeSync" is the same as G-Sync. We know Nvidia has always been very premium, but there must be a reason why G-Sync is what it is and not just another cash cow.

FreeSync syncs the refresh rate with the FPS, same as G-Sync. Nvidia is too self-confident; TXAA and ShadowPlay are now the only gaming features exclusive to them.
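To put a rough number on what syncing the refresh rate to the frame rate buys you, here is a small Python sketch. The scenario is hypothetical: a game that needs a steady 20 ms per frame (50 fps) on a 60 Hz panel; the figures are illustrative, not benchmarks from either vendor.

refresh_ms = 1000.0 / 60   # fixed 60 Hz panel: a refresh tick every ~16.7 ms
render_ms = 20.0           # the game needs 20 ms per frame (50 fps)

# Fixed refresh + vsync with double buffering: a finished frame waits for the
# next tick, and the next frame cannot start until the flip, so the effective
# rate drops to 30 fps.
flips_fixed, t = [], 0.0
for _ in range(5):
    done = t + render_ms                        # frame finishes rendering
    flip = refresh_ms * -(-done // refresh_ms)  # next refresh tick at or after 'done'
    flips_fixed.append(round(flip, 1))
    t = flip                                    # the next frame starts at the flip
print("fixed 60 Hz + vsync:  ", flips_fixed)    # [33.3, 66.7, 100.0, 133.3, 166.7]

# Refresh synced to the frame rate: the panel refreshes when each frame is
# ready, so the full 50 fps is kept.
flips_synced = [round((i + 1) * render_ms, 1) for i in range(5)]
print("refresh synced to fps:", flips_synced)   # [20.0, 40.0, 60.0, 80.0, 100.0]

Turning vsync off avoids the drop to 30 fps but trades it for tearing, which is exactly what both G-Sync and FreeSync are trying to remove.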


I'm sorry, but FreeSync is a load of crap IMO. If this technology works so well, why didn't NVIDIA take advantage of it first? Why wouldn't NVIDIA save the time and money instead of creating a new standard? The reason? There is something fatally wrong with FreeSync. What that is, I don't know, but there is something wrong.

^^

-The Bellerophon- Obsidian 550D-i5-3570k@4.5Ghz -Asus Sabertooth Z77-16GB Corsair Dominator Platinum 1866Mhz-x2 EVGA GTX 760 Dual FTW 4GB-Creative Sound Blaster XF-i Titanium-OCZ Vertex Plus 120GB-Seagate Barracuda 2TB- https://linustechtips.com/main/topic/60154-the-not-really-a-build-log-build-log/ Twofold http://linustechtips.com/main/topic/121043-twofold-a-dual-itx-system/ How great is EVGA? http://linustechtips.com/main/topic/110662-evga-how-great-are-they/#entry1478299


Bottom line, it's better for everyone if there is some universal, standard capability for GPUs to dynamically change the refresh rates of displays. If they can build that into DisplayPort 1.3 and updated monitors, then that sounds great to me.

I am impelled not to squeak like a grateful and frightened mouse, but to roar...


May I just point out that there have only been two prices announced, aside from the DIY kits, which are hard to obtain and therefore inflated in price.

If Nvidia really cared ONLY about money and not innovation, they wouldn't have made Shield.

The way you're acting is irrational. Yeah, Nvidia has a focus on money, but you also can't blame Nvidia for $600 monitors; the fact is Asus and Philips are the ones who chose what each is worth.

Also, the Asus one is 144 Hz and 2560x1440.

AMD may have something cool, but since it's free it will probably be good, not great, compared with paying a small premium for G-Sync; and most likely, as G-Sync becomes more common, that premium will naturally come down.

Asus' WQHD monitor is priced at around $800, and the VG monitor is somewhere in the $500-600 range. If Nvidia wanted to push innovation, they would not have made PhysX exclusive to their cards and would not have charged that much for a G-Sync PCB. They are doing the opposite of what AMD is doing and overcharging consumers.

Just because it is free doesn't mean it is instantly bad; is YouTube a bad service for being free?

It is nowhere near a "small premium"; these are overhyped TN monitors that you may simply not need at all.

Sorry, but you sound like one of those Nvidia fanboys who blatantly ignore anything good AMD has to offer because you prefer to overpay and like the color green :)


Bottom line, it's better for everyone if there is some universal, standard capability for GPUs to dynamically change the refresh rates of displays. If they can build that into DisplayPort 1.3 and updated monitors, then that sounds great to me.

If it becomes universal, couldn't we see higher refresh rates on IPS monitors?


Asus' WQHD monitor is priced at around $800, and the VG monitor is somewhere in the $500-600 range. If Nvidia wanted to push innovation, they would not have made PhysX exclusive to their cards and would not have charged that much for a G-Sync PCB. They are doing the opposite of what AMD is doing and overcharging consumers.

Just because it is free doesn't mean it is instantly bad; is YouTube a bad service for being free?

It is nowhere near a "small premium"; these are overhyped TN monitors that you may simply not need at all.

Sorry, but you sound like one of those Nvidia fanboys who blatantly ignore anything good AMD has to offer because you prefer to overpay and like the color green :)

You really didn't listen to me. Nvidia didn't price the monitors; Asus and Philips did. That was my point: we don't know how much the PCB was sold for, we just know how much it is being sold to us for.

And no, I am not an Nvidia fanboy; I am considering getting an AMD card because of Mantle, once we hear more about it.

Also, yes, YouTube is a bad service for being free; because of how it is set up, there have been many issues.

For one, they do not protect their content creators, leaving them out to dry, whereas a more premium service, even if not as good in some ways, would be able to take care of its people and perhaps even pay them more, allowing for better production values... almost like television.

And just to add to it, Nvidia has always been proprietary; as bad as that may be, it is what it is, and it makes business sense.


