
Nvidia slams AMD FreeSync: "We can't comment on pricing of products that don't exist"

Faa

Given what the freesync solution is on paper, it can't be as good. G-sync tells the monitor when to draw. Free-sync stops the GPU from calculating the next frame. Unless there is much more to the story, freesync is going to fall very flat.

 

I'm not going to argue about the technology here, but only say you missed the point entirely.



I would hardly call that a "slam". Simple fact is AMD haven't got anything out yet for FreeSync, so of course they cannot gauge their prices. All Nvidia know is they have to make as much cash as they can before AMD start putting out FreeSync monitors. After AMD release theirs, Nvidia will lower their prices to compete. 'Tis a simple thing called business after all.

Yeah I more or less agree with you here.

 

Obviously, this is a "No Duh" or "No Shit" moment here. Freesync is still in development, and G-Sync has a retail - ready to buy - monitor available right this moment.

 

With that in mind, this is obviously NVidia PR bullshit. They can probably guess pretty damn accurately what the Freesync monitors will cost. Hell they probably know exactly how AMD designed the damn things.

 

As to everyone saying "Insert solution" is the best - well you are just speculating and you know it. We won't know how good Freesync is until it's released and can be compared. Honestly I have faith that it will match or come close to G-Sync at a lower price point. That might force NVidia to lower their prices when Freesync launches - or they might just keep their premium price tag. Honestly I believe it'll depend on how well Freesync competes.


 


Free-Sync was and is an attempt by AMD to steal some of Nvidia's thunder. They don't have anything to show for themselves.

 

Seriously? Or just trolling? ^-^



Yeah I more or less agree with you here.

 

Obviously, this is a "No Duh" or "No Shit" moment here. Freesync is still in development, and G-Sync has a retail - ready to buy - monitor available right this moment.

 

With that in mind, this is obviously NVidia PR bullshit. They can probably guess pretty damn accurately what the Freesync monitors will cost. Hell they probably know exactly how AMD designed the damn things.

 

As to everyone saying "Insert solution" is the best - well you are just speculating and you know it. We won't know how good Freesync is until it's released and can be compared. Honestly I have faith that it will match or come close to G-Sync at a lower price point. That might force NVidia to lower their prices when Freesync launches - or they might just keep their premium price tag. Honestly I believe it'll depend on how well Freesync competes.

Dude, NVIDIA publicly said:

 

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When this is completely bullshit... what did you expect?


Ugh, these posts that do nothing but invite fanboyism/nerdrage arguments need to stop.


 


 

That Acer 4k2k XB280HK looks really tempting, but something about it being an Acer is really off-putting.



That Acer 4k2k XB280HK looks really tempting, but something about it being an Acer is really off-putting.

I'm probably going to go for the AOC one; 3 years' warranty and it's pretty cheap, hard to beat that.


Given what the freesync solution is on paper, it can't be as good. G-sync tells the monitor when to draw. Free-sync stops the GPU from calculating the next frame. Unless there is much more to the story, freesync is going to fall very flat.

Actually it's the other way around. G-Sync uses a complex two-way system, where the GPU constantly spams the monitor asking if it's ready for a frame. When the monitor says yes, it is sent a frame.

Adaptive Sync, on the other hand, uses plug and play to tell the GPU, upon connection, what the monitor's supported min/max Hz are. The GPU will then send frames within that interval, and the monitor will display each one instantly.

Both G-Sync and Adaptive Sync will give stutter if they go below the supported min Hz of the monitor.

So Adaptive Sync is simpler, easier and actually better (remember Adaptive Sync supports 9-240 Hz in the standard, G-Sync only 30-144 Hz).
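If it helps to see that idea in one place, here's a minimal sketch of the behaviour described above. The 40-144 Hz window, the function name and the frame times are illustrative assumptions of mine, not anything from the spec or a driver; the below-min-Hz clamp is also why both techs stutter in that case.

```python
# Minimal sketch of the Adaptive Sync idea described above (illustrative only; the
# 40-144 Hz window and the frame times are made-up example values, not spec values).

def present_frames(frame_times_ms, min_hz=40, max_hz=144):
    """Return the refresh interval actually used for each frame, given GPU render times."""
    fastest = 1000.0 / max_hz   # shortest allowed gap between refreshes (~6.9 ms at 144 Hz)
    slowest = 1000.0 / min_hz   # longest the panel can hold a frame (25 ms at 40 Hz)
    intervals = []
    for t in frame_times_ms:
        if t < fastest:
            intervals.append(fastest)   # GPU too fast: wait for the panel
        elif t > slowest:
            intervals.append(slowest)   # GPU too slow: panel refreshes anyway -> repeated frame / stutter
        else:
            intervals.append(t)         # inside the window: the frame is shown as soon as it's done
    return intervals

print(present_frames([5.0, 12.5, 18.0, 30.0]))  # [6.94..., 12.5, 18.0, 25.0]
```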



Please give me a couple of gsync monitor links for purchase.

T

R

O

L

L



T

R

O

L

L

Please don't accuse other members of being trolls. If you suspect troll activity, please just report them to Admins/Mods, and keep the personal attacks/insults off the forums.

 

 

Actually it's the other way around. G-Sync uses a complex two-way system, where the GPU constantly spams the monitor asking if it's ready for a frame. When the monitor says yes, it is sent a frame.

Adaptive Sync, on the other hand, uses plug and play to tell the GPU, upon connection, what the monitor's supported min/max Hz are. The GPU will then send frames within that interval, and the monitor will display each one instantly.

Both G-Sync and Adaptive Sync will give stutter if they go below the supported min Hz of the monitor.

So Adaptive Sync is simpler, easier and actually better (remember Adaptive Sync supports 9-240 Hz in the standard, G-Sync only 30-144 Hz).

I agree with your assessment, based on what we've heard. However, we do need to keep an open mind on this. Freesync could be just as good as G-Sync. Or it could be better. Or it could be worse. On paper, it looks totally kickass, but let's reserve final judgment until Freesync monitors are out and benchmarked.

 

I can't WAIT until we can see a head-to-head battle between the technologies. Personally, I think that G-Sync's days are limited, unless it vastly outperforms Freesync. The reason I think this is because AMD is working with VESA to put Adaptive-Sync (The technology on the monitor side of things) into the official DP spec. NVidia will then be able to use Adaptive-Sync with their own cards. They could still use the G-Sync branding, but they wouldn't need the G-Sync module anymore. AMD specifically made sure that the technology was open so that Freesync wasn't just another competing proprietary standard.


 


You can't use Lightboost at the same time as G-sync anyway.

 

I know, but for a 144Hz panel that supports 3D, it should have been offered for when G-sync is not enabled.



Actually it's the other way around. G-Sync uses a complex two-way system, where the GPU constantly spams the monitor asking if it's ready for a frame. When the monitor says yes, it is sent a frame.

The GPU polls the monitor to see whether it's in the vblank state or not, which causes a slight hit to your framerate. If it's not, you get a new frame on your display. When the monitor is done drawing the frame, it waits for the GPU to render the next one, so it's not being spammed.

 

The GPU will then send frames within that interval, and the monitor will display each one instantly.

There's a difference between changing the static refresh rate and changing the frame refresh interval on the fly. At the moment AS is static, as has been proven to you in real time twice.

G-Sync works by modifying the vblank interval, using it as a delay between the current and next frame so you don't end up with stutter (or whatever other benefits it has), but you do get slightly higher input lag. All LCD panels have a vblank interval; it would be pointless trying to send frames during it with a new technology. Adaptive Sync modifies vblank as well, using it as a delay.
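To put rough numbers on the "vblank as a delay" description: if the blanking period after each scanout is stretched until the next frame arrives, the effective refresh rate simply tracks the GPU's frame time. The 6.9 ms scanout figure (roughly a 144 Hz panel) and the frame times below are assumptions for illustration, not from any datasheet.

```python
# Rough illustration of extending vblank to cover the GPU's frame time (example values only).

def effective_refresh(gpu_frame_ms, scanout_ms=6.9):
    """Return (effective Hz, extra vblank in ms) when vblank is stretched to the frame time."""
    refresh_interval = max(gpu_frame_ms, scanout_ms)   # can't refresh faster than the panel scans out
    extra_vblank = refresh_interval - scanout_ms
    return 1000.0 / refresh_interval, extra_vblank

for frame_ms in (5.0, 12.0, 20.0):
    hz, vblank = effective_refresh(frame_ms)
    print(f"{frame_ms:5.1f} ms frame -> {hz:5.1f} Hz effective, vblank extended by {vblank:4.1f} ms")
```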

 

 

So Adaptive Sync is simpler, easier and actually better (remember Adaptive Sync supports 9-240 Hz in the standard, G-Sync only 30-144 Hz).

 

Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz. http://www.forbes.com/sites/jasonevangelho/2014/05/12/amds-project-freesync-gets-momentum-as-adaptive-sync-gets-added-to-displayport-spec/

You can't really defend PR lies about a non-existent product and claim the lies are better than an existing product.


The GPU polls the monitor to see whether it's in the vblank state or not, which causes a slight hit to your framerate. If it's not, you get a new frame on your display. When the monitor is done drawing the frame, it waits for the GPU to render the next one, so it's not being spammed.

 

There's a difference between changing the static refresh rate and changing the frame refresh interval on the fly. At the moment AS is static, as has been proven to you in real time twice.

G-Sync works by modifying the vblank interval, using it as a delay between the current and next frame so you don't end up with stutter (or whatever other benefits it has), but you do get slightly higher input lag. All LCD panels have a vblank interval; it would be pointless trying to send frames during it with a new technology. Adaptive Sync modifies vblank as well, using it as a delay.

 

 

 

Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz. http://www.forbes.com/sites/jasonevangelho/2014/05/12/amds-project-freesync-gets-momentum-as-adaptive-sync-gets-added-to-displayport-spec/

You can't really defend PR lies about a non-existent product and claim the lies are better than an existing product.

Just like you can't claim they are lies about a non-existent product.

 

We just don't know how well Freesync will stack up. AMD isn't telling any lies that I've seen. If you have, please post them (as that would surely be breaking news). Once we have an actual demo product to check out from Freesync, then we'll be able to determine how factual AMD was being.


 


I agree with your assessment, based on what we've heard. However, we do need to keep an open mind on this. Freesync could be just as good as G-Sync. Or it could be better. Or it could be worse. On paper, it looks totally kickass, but let's reserve final judgment until Freesync monitors are out and benchmarked.

 

I can't WAIT until we can see a head-to-head battle between the technologies. Personally, I think that G-Sync's days are limited, unless it vastly outperforms Freesync. The reason I think this is because AMD is working with VESA to put Adaptive-Sync (The technology on the monitor side of things) into the official DP spec. NVidia will then be able to use Adaptive-Sync with their own cards. They could still use the G-Sync branding, but they wouldn't need the G-Sync module anymore. AMD specifically made sure that the technology was open so that Freesync wasn't just another competing proprietary standard.

I completely agree. I honestly don't think there will be any real-life performance difference between the two. I would think that all gaming monitors, and most mid/high-end monitors, will have variable refresh rates in 2-3 years, which will benefit all gamers. However, I still think Adaptive Sync is the better of the two for other reasons:

  1. Adaptive Sync is an industry standard, so everyone can support it, including Intel and Nvidia. As such, G-Sync is redundant. As you said, Nvidia can even still use their G-Sync branding on the driver side.
  2. Adaptive Sync is cheaper to make, as it only requires an upgraded display controller IC instead of the expensive FPGA with 750 MB of RAM in the G-Sync module. No royalty either. Certainly a new Adaptive Sync monitor will carry a price premium at launch, but that should come down relatively fast.
  3. Adaptive Sync will support native 24fps video playback. G-Sync only goes down to 30 Hz, so no go. This will make Adaptive Sync relevant to some video editors, the likes of Edzel, meaning AS can be relevant to people other than gamers (see the quick check after the list).

Either way, variable refresh rate should be the holy grail for all gamers.
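Here's the quick check on the 24 fps point in item 3, using only the ranges quoted in this thread (9-240 Hz for the Adaptive Sync spec, 30-144 Hz for current G-Sync monitors). This is arithmetic on the thread's own numbers, not an official compatibility claim.

```python
# 24 fps film against the refresh ranges quoted in this thread (illustrative only).

def supports_native_rate(fps, lo_hz, hi_hz):
    return lo_hz <= fps <= hi_hz

print(supports_native_rate(24, 9, 240))   # True  -> the panel can refresh 1:1 with the video
print(supports_native_rate(24, 30, 144))  # False -> frames have to be repeated instead

# On a fixed 60 Hz panel, 24 fps needs 3:2 pulldown: frames alternate between 2 and 3
# refreshes on screen (33.3 ms vs 50 ms), which is exactly the judder variable refresh avoids.
print([refreshes * 1000 / 60 for refreshes in (2, 3)])  # [33.33..., 50.0]
```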



Just like you can't claim they are lies about a non-existent product.

 

We just don't know how well Freesync will stack up. AMD isn't telling any lies that I've seen. If you have, please post them (as that would surely be breaking news). Once we have an actual demo product to check out from Freesync, then we'll be able to determine how factual AMD was being.

"FreeSync progress is very good," Huddy explained. "At Computex in June this year we showed a prototype monitor which allowed a dynamic refresh rate just in the narrow band from 40Hz to 60Hz, so that gets rid of a great deal of tearing but doesn’t solve everything. 

"Die Demo läuft mit einer Bildwiederholfrequenz von 47 bis 48 Frames pro Sekunde und der Monitor liefert dank der angepassten Bildwiederhohlfrequenz ein einwandfreies Bild ab, das dem von Nvidias G-Sync sehr ähnlich sieht. Eine variierende Framerate, auf die der Monitor reagieren kann, lässt die Demo aktuell noch nicht zu. Während der Monitor während der Präsentation keine Probleme machte, stürzte die Techdemo von AMD zudem mehrfach ab."

"The demo runs at a refresh rate 47-48 frames per second and the monitor provides thanks to the custom frequency a perfect picture that is very similar to Nvidia G-Sync. Currently the demo does not have a varying framerate at which the monitor could react. While the monitor showed no problems during the presentation, the tech demo of AMD also crashed several times."

Computex demo: http://www.computerbase.de/2014-06/amd-freesync-monitore-guenstiger-als-g-sync/

Anandtech video showing a static refresh rate;

http://youtu.be/pIp6mbabQeM?t=28s

Go look at the OCN threads; everyone is aware of their lies & false presentations.


The GPU polls the monitor to see whether it's in the vblank state or not, which causes a slight hit to your framerate. If it's not, you get a new frame on your display. When the monitor is done drawing the frame, it waits for the GPU to render the next one, so it's not being spammed.

 

There's a difference between changing the static refresh rate and changing the frame refresh interval on the fly. At the moment AS is static, as has been proven to you in real time twice.

G-Sync works by modifying the vblank interval, using it as a delay between the current and next frame so you don't end up with stutter (or whatever other benefits it has), but you do get slightly higher input lag. All LCD panels have a vblank interval; it would be pointless trying to send frames during it with a new technology. Adaptive Sync modifies vblank as well, using it as a delay.

 

Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz. http://www.forbes.com/sites/jasonevangelho/2014/05/12/amds-project-freesync-gets-momentum-as-adaptive-sync-gets-added-to-displayport-spec/

You can't really defend PR lies about a non-existent product and claim the lies are better than an existing product.

  

Adaptive Sync works much more simply than G-Sync: the GPU sends a frame within the stated min/max interval. After the frame comes a vblank start signal, telling the monitor to hold the image. As soon as a new frame is done, and within the monitor's min/max interval, a vblank stop signal is sent, followed by the frame. It's simple one-way communication, based on the monitor's min/max interval, which is sent to the GPU upon connection (or startup). Much simpler and easier than G-Sync's overcomplicated two-way communication method, including expensive buffer RAM.
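To make that one-way signalling concrete, here's a toy trace of the sequence just described. The event names, the 40-144 Hz window and the timings are my own illustrative assumptions, not wording from the DisplayPort spec.

```python
# Toy trace of the one-way Adaptive Sync signalling described above (illustrative only).

def adaptive_sync_trace(frame_ready_ms, min_hz=40, max_hz=144):
    fastest, slowest = 1000.0 / max_hz, 1000.0 / min_hz
    events, last = [], 0.0
    for ready in frame_ready_ms:
        send_at = max(ready, last + fastest)          # never exceed the panel's max Hz
        if send_at - last > slowest:                  # GPU too slow for the panel's min Hz
            events.append((last + slowest, "panel self-refresh (below min Hz -> stutter)"))
        events.append((send_at, "vblank stop, then frame, then vblank start (hold image)"))
        last = send_at
    return events

for t, name in adaptive_sync_trace([10.0, 18.0, 50.0]):
    print(f"{t:6.2f} ms  {name}")
```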

 

As for the idea that AS is static, watch this. Obviously it's not static. I can only repeat VESA's own official announcement of their own standard:

Displayport: DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.

http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

A frame-by-frame basis is not static.

 

The AS standard supports min/max intervals from 9-240 Hz. Since no monitors go over 144 Hz, no monitor will go over this. The potential ranges are examples of what we can expect the vendors to support, but it is completely the choice of the vendors, not VESA or AMD. 23/24-144 Hz will probably be what we will see on an ASUS ROG monitor. For standard monitors, maybe 23/24-60/75 Hz. Again, it is completely up to the vendors of the display controller to define the range supported. That is what open standards are for in terms of vendor freedom.

I chose 23/24 for playback support, but we know fps can dip lower, so the supported minimum could be lower as well. Let's see this month or next, when production prototypes are released for testing.



As for the idea that AS is static, watch this. Obviously it's not static.

You were wrong and you remain wrong until you have some proper evidence that confirms a dynamic refresh rate working. With some text you aren't getting anywhere, when we had multiple sources and videos confirming the refresh rate was static, proving AMD completely wrong along with VESA.

Respond when you have evidence that confirms it, not some sales talk that has been proved wrong already.

 


Information regarding Adaptive Sync:

http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Read it. They would not announce it if it wasn't 100% functional. It is one of the key points of Adaptive Sync.

 

Computer monitors normally refresh their displays at a fixed frame rate. In gaming applications, a computer’s CPU or GPU output frame rate will vary according to the rendering complexity of the image. If a display’s refresh rate and a computer’s render rate are not synchronized, visual artifacts—tearing or stuttering—can be seen by the user. DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.

 

In applications where the display content is static—such as surfing the web, reading email, or viewing a slide presentation—DisplayPort Adaptive-Sync allows the display refresh rate to be reduced seamlessly, lowering system power and extending battery life.

 

In addition to providing smoother video playback, the lower frame rate enabled by Adaptive-Sync also reduces power demand, extending battery life.

Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user’s content, enabling power efficient transport over the display link and a fluid, low-latency visual experience


snip

A different article:

"So now what? AMD is at Computex and of course is taking the opportunity to demonstrate a "FreeSync" monitor with the DisplayPort 1.2a Adaptive Sync feature at work. Though they aren't talking about what monitor it is or who the manufacturer is, the demo is up and running and functions with frame rates wavering between 40 FPS and 60 FPS..."

http://www.pcper.com/news/Graphics-Cards/AMD-Demonstrates-Prototype-FreeSync-Monitor-DisplayPort-Adaptive-Sync-Feature


"The demo runs at a refresh rate 47-48 frames per second and the monitor provides thanks to the custom frequency a perfect picture that is very similar to Nvidia G-Sync. Currently the demo does not have a varying framerate at which the monitor could react. While the monitor showed no problems during the presentation, the tech demo of AMD also crashed several times."

Computex demo: http://www.computerbase.de/2014-06/amd-freesync-monitore-guenstiger-als-g-sync/

Anandtech video showing a static refresh rate;

http://youtu.be/pIp6mbabQeM?t=28s

Go look at the OCN threads; everyone is aware of their lies & false presentations.

 

Holy manipulation Batman.

The Anandtech video is just a proof of concept that the variable vblank spec exists in eDP, so there is no need for an overpriced G-Sync module. The point is to show that the Hz can be manipulated to an odd number.

The second one shows that ordinary monitors can support this as well, without the module. Here's yet another existing off-the-shelf monitor, with a simple firmware update, running it with a limited interval: https://www.youtube.com/watch?v=fZthhqmhbw8

 

Remember that all of these are just proofs of concept. None of them are Adaptive Sync prototypes. It's like saying Star Citizen does not work and the devs are lying about it because it crashes and doesn't have all features yet in pre-alpha. You cannot claim AS sucks when no AS display controller prototypes are out yet. Yet you seem to discard and ignore official statements about the standard from the organization creating it, which will be supported in AS-compatible display controllers. Those prototypes will be out this month or next.

You do know the difference between a proof of concept and a prototype, right?

 

You were wrong and you remain wrong until you have some proper evidence that confirms a dynamic refresh rate working. With some text you aren't getting anywhere, when we had multiple sources and videos confirming the refresh rate was static, proving AMD completely wrong along with VESA.

 

 

All credibility is gone when you claim a standards organization is wrong about their own standard, based on variable vblank, which is also their own standard. I have explained how AS works and how it uses variable vblanks, the EXACT same tech G-Sync is based on. Fair enough that you want a fully functional AS prototype, but they are not out just yet. But ignoring how the technologies work, and claiming VESA is wrong, is just bonkers. You do know Nvidia is a member of VESA, right?



Read it. They would not announce it if it wasn't 100% functional. It is one of the key points of Adaptive Sync.

If you'd read it, it was literally ported from eDP to DP 1.2a. Haven't you seen my post in #40 where I showed a video with the laptop (not working) and the Computex demo (desktop monitor) where they've hidden the FPS on purpose and that source found it out? Besides, there's a difference between switching between different static refresh rates and adjusting dynamically on the fly.

It's obvious it doesn't have a dynamic refresh rate, or else they wouldn't have been hiding it.


If you'd read it, it was literally ported from eDP to DP 1.2a. Haven't you seen my post in #40 where I showed a video with the laptop (not working) and the Computex demo (desktop monitor) where they've hidden the FPS on purpose and that source found it out? Besides, there's a difference between switching between different static refresh rates and adjusting dynamically on the fly.

It's obvious it doesn't have a dynamic refresh rate, or else they wouldn't have been hiding it.

Where did you read that it was literally ported (copy-pasted) from eDP to DP 1.2a?

Static does not change. That is the point of static.


If you'd read it, it was literally ported from eDP to DP 1.2a. Haven't you seen my post in #40 where I showed a video with the laptop (not working) and the Computex demo (desktop monitor) where they've hidden the FPS on purpose and that source found it out? Besides, there's a difference between switching between different static refresh rates and adjusting dynamically on the fly.

It's obvious it doesn't have a dynamic refresh rate, or else they wouldn't have been hiding it.

 

There you go, spreading misinformation yet again.

No, Adaptive Sync is nowhere near "literally ported from eDP". Not even close. AS is based on variable vblanks, invented for power saving on eDP. The plug-and-play min/max interval handshake and the variable framerate are not supported in eDP, which is why the laptops in your example below cannot run variable framerates (which wasn't even the point of that particular demo):

 

Anandtech video showing a static refresh rate;

http://youtu.be/pIp6mbabQeM?t=28s

 

 

You obviously have no intention of learning the differences, or caring what others write. You are constantly confusing and mixing up names, terms, technologies, standards and functionality, and refuse to learn how it really is. You are wasting everybody's time with your nonsense. I don't know if you are a fanboy, a troll, or just extremely stubborn. But I thought people were in here to learn, not to spam misinformation.


