[UPDATE] Official: FreeSync Launched, Dubbed Adaptive-Sync.

If Adaptive-Sync (FreeSync) is open and free, doesn't that mean every GPU should be able to use it? So future Nvidia GPUs, AMD GPUs/APUs, and Intel integrated graphics. Doesn't everyone win?

Yes, that's the idea.

Welp, that's a new monitor and a GPU to buy... I do have an R9 270 in my secondary rig though... we'll see what happens.

RIG - Processor: Intel Core i7 3770K @ 4.4GHz, Mobo: MSI Z77-G43, GPU: Gigabyte GTX 770, RAM: 16 GB G.Skill Sniper F3, SSD: Corsair Force F3 240GB, HDD: Seagate Barracuda 1TB, Cooler: CM Hyper 212 EVO, Case: Sharkoon T28 Blue

Peripherals - Monitor: Samsung S24B300, Keyboard: Razer BlackWidow, Mouse: Razer Abyssus, Headphones: Razer Megalodon, Mousepad: Razer Goliathus Alpha, Webcam: Logitech C270, Pad: Logitech F710, Speakers: generic Philips ones

#KILLEDMYWIFE #MAKEBOMBS

Welp, that's a new monitor and a GPU to buy... I do have an R9 270 in my secondary rig though... we'll see what happens.

Well. That's only if you really need it now. Wait for your next upgrade :D

If they don't have DisplayPort now, then they won't have it later. (They would also need different software to enable support.)
The whole point of a console is that it doesn't change.
They would never make a different version that supports more features than the older one.

The standard may be free, but the tech inside isn't.
The scaler module needed to enable it is more complex than a normal one because it has to communicate with the GPU.
So they have to raise the price if they want to keep the same profit.

Do you even know what DP rides on? Most of the time it's the digital signal without a scaler chip, so it's actually even cheaper to make, whereas with HDMI you need to pay a royalty fee.

And the chips in current consoles are from AMD, based on GCN 1.1/2.0, which has DP 1.2a support.

 

As in anything HIGHER than a 260. 260X, 270, 270X, 280, 280X, 290, 290X... Then the other cards with that architecture, so most likely the 7000 series too...

I think it's only GCN 1.1? Snicker snicker. The 265 to 280X are GCN 1.0; the 7790 is GCN 1.1.

 

 

Hopefully Nvidia will have something to respond to this with.

Have you heard that it's an industry standard?

I would rather stick with Nvidia and get G-Sync with it. Not saying that I am choosing Nvidia for G-Sync, but this new "A.S." isn't enough for me to make the switch.

Here is a list of common fallacies. Which ones have you used today?

Any word on how nice it plays with strobing backlights (LightBoost)?

 

Chances are, it won't. If you are strobing at a variable rate then you have to constantly account for brightness and color shift. Not to mention strobing works best at 100+Hz. While there are 60Hz implementations, there's the issue of flicker sensitivity and fatigue. Personally, I find 60Hz CRTs painful to look at.

 

With the G-Sync kit you can either run in ULMB mode or G-Sync, not both. ULMB is a fixed strobe setting that corrects for color shift.

 

Even with those limitations, the G-Sync module has a massive FPGA with 3x 256MB RAM chips (mostly for memory bandwidth, not capacity).
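
To put rough numbers on the brightness side of that: perceived brightness tracks the strobe duty cycle (pulse width divided by frame time), so if the refresh interval varies, the pulse width has to be rescaled every frame or the picture visibly dims and brightens. Here's a minimal sketch of that compensation, purely illustrative; the duty cycle and frame times are made-up values, not from any actual ULMB or G-Sync implementation:

/* Illustrative only: hold perceived brightness constant while the frame
 * time varies, by scaling the strobe pulse with the frame time.
 * target_duty and frame_times_ms are made-up example values. */
#include <stdio.h>

int main(void) {
    const double target_duty = 0.25;  /* hypothetical 25% backlight duty cycle */
    const double frame_times_ms[] = { 6.9, 8.3, 11.1, 16.7, 25.0 }; /* ~144 Hz down to 40 Hz */
    const int n = sizeof frame_times_ms / sizeof frame_times_ms[0];

    for (int i = 0; i < n; i++) {
        double pulse_ms = target_duty * frame_times_ms[i];
        printf("frame %5.1f ms (~%3.0f Hz) -> strobe pulse %5.2f ms\n",
               frame_times_ms[i], 1000.0 / frame_times_ms[i], pulse_ms);
    }
    return 0;
}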

main(i){for(;i<101;i++)printf("Fizz\n\0Fizzz\bBuzz\n\0%d\n"+(!(i%5)^!!(i%3)*3)*6,i);}

Chances are, it won't. If you are strobing at a variable rate then you have to constantly account for brightness and color shift. Not to mention strobing works best at 100+Hz. While there are 60Hz implementations, there's the issue of flicker sensitivity and fatigue. Personally, I find 60Hz CRTs painful to look at.

 

With the G-Sync kit you can either run in ULMB mode or G-Sync, not both. ULMB is a fixed strobe setting that corrects for color shift.

 

Even with those limitations, the G-Sync module has a massive FPGA with 3x 256MB RAM chips (mostly for memory bandwidth, not capacity).

 

Thank you, I wasn't considering the stresses that a variable strobe rate would introduce, especially at lower Hz/fps.

 

http://youtu.be/gbW9IwVGpX8?t=4m

 

At 4 minutes Carmack talks about a crossover point between ULMB & G-Sync, so I guess it may just take some time to iron out the implementation. What do you think?

And the chips in current consoles are from AMD, based on GCN 1.1/2.0, which has DP 1.2a support.

That doesn't matter; they would need to release a software update for that and put a DisplayPort on the console.

Console games are made with FPS in mind; they even cut individual assets/textures/effects to get a steady FPS. That's why you see almost no FPS jumps in console games.

Devs would need to make a V-Sync-free version of the game that only runs like that on the updated console.

There is no way this is going to happen on consoles.

Also, Sony owns part of HDMI, so they aren't paying fees; they're getting money from the fees.

So they will naturally make their console HDMI-only so that TV makers use HDMI.

RTX2070OC 

Wait, is the 280X not supported in this? I'm gonna poop a shit if it isn't... It doesn't list the 280X, only the 290, 290X, 270, and 270X.

I don't think the 280X is supported, as it's just a 7970. I don't think it has the newest DP version to use this.

Thank you, I wasn't considering the stresses that a variable strobe rate would introduce, especially at lower Hz/fps.

 

http://youtu.be/gbW9IwVGpX8?t=4m

 

At 4 minutes Carmack talks about a crossover point between ULMB & G-Sync, so I guess it may just take some time to iron out the implementation. What do you think?

 

Sure. The limitations G-Sync has were most likely driven by cost. The Altera Arria V GX FPGA isn't cheap as it is; the cheapest one on Mouser is $371.
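
And on the 3x 256MB RAM point from earlier, a quick back-of-the-envelope (assumed panel numbers, nothing official) shows why that reads as a bandwidth decision rather than a capacity one; a single 1440p frame is only around 10 MB:

/* Back-of-the-envelope: one 1440p frame vs. the module's 3x 256 MB of DRAM.
 * Panel resolution and 24-bit colour are assumptions for illustration. */
#include <stdio.h>

int main(void) {
    const long long frame_bytes = 2560LL * 1440 * 3;         /* 24-bit RGB frame */
    const long long dram_bytes  = 3LL * 256 * 1024 * 1024;   /* 3x 256 MB chips  */

    printf("one frame  : %.1f MB\n", frame_bytes / (1024.0 * 1024.0));
    printf("module DRAM: %lld MB (~%.0fx one frame)\n",
           dram_bytes / (1024 * 1024), (double)dram_bytes / frame_bytes);
    return 0;
}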

main(i){for(;i<101;i++)printf("Fizz\n\0Fizzz\bBuzz\n\0%d\n"+(!(i%5)^!!(i%3)*3)*6,i);}

Doesn't this technology require DisplayPort to be usable? Whereas G-Sync works with broader display connections like HDMI and DVI and the like, whatever is compatible.

Using Tapatalk

I remember how Linus criticized AMD for FreeSync when it was first announced, only citing Nvidia's response, which was a bullcrap "we're the only company that has a scaler that can do this", and as we know that was a complete lie.

Dynamic refresh rate has existed since 2009 in embedded DisplayPort as a firmware feature. A scaler was never needed, as is evident today with this announcement.

 

 

Interested to see the clip where I criticized FreeSync :)

 

I did pass along NVIDIA's response, but that was because curiously AMD ONLY briefed Anandtech on the technology back at CES.

 

Here's my excited Facebook post from the initial announcement: https://www.facebook.com/LinusTech/posts/554303207999462?stream_ref=5

 

Rawr.

Wait, is the 280X not supported in this? I'm gonna poop a shit if it isn't... It doesn't list the 280X, only the 290, 290X, 270, and 270X.

I think it's only supported on the newer GPUs, which came out in 2013. And the 280X is a reclocked 7970... BTW, the 270 series is not supported either; the 260 range is.

Edit: I also hope that this will push DisplayPort adoption.

That doesn't matter; they would need to release a software update for that and put a DisplayPort on the console.

Console games are made with FPS in mind; they even cut individual assets/textures/effects to get a steady FPS. That's why you see almost no FPS jumps in console games.

Devs would need to make a V-Sync-free version of the game that only runs like that on the updated console.

There is no way this is going to happen on consoles.

Also, Sony owns part of HDMI, so they aren't paying fees; they're getting money from the fees.

So they will naturally make their console HDMI-only so that TV makers use HDMI.

Even Sony is paying the royalty fee, you know.

 

Doesn't this technology require DisplayPort to be usable? Whereas G-Sync works with broader display connections like HDMI and DVI and the like, whatever is compatible.

But a G-Sync monitor is a lot of extra money. DP should be the port we are all using, but everyone's too hung up on HDMI and DVI to move on. DP is significantly smaller and sits securely in its slot.

I think it's only supported on the newer GPUs, which came out in 2013. And the 280X is a reclocked 7970... BTW, the 270 series is not supported either; the 260 range is.

Edit: I also hope that this will push DisplayPort adoption.

The 280X is a higher-binned 7970 though, with added boost firmware. Recently there have been Tahiti XTL ASICs running around that pull only 130-150W at full load. Still, it's GCN 1.0.
 
Same here.
 

Even Sony is paying the royalty fee, you know.

 

But a G-Sync monitor is a lot of extra money. DP should be the port we are all using, but everyone's too hung up on HDMI and DVI to move on. DP is significantly smaller and sits securely in its slot.

The 280X is a higher-binned 7970 though, with added boost firmware. Recently there have been Tahiti XTL ASICs running around that pull only 130-150W at full load. Still, it's GCN 1.0.
 
Same here.

 

 

I have a Club 3D 280X with the XTL revision. At 1200 core / 1700 memory it draws about 190-200W during Fire Strike Extreme. That's quite a difference from a 7970, so I can vouch for this.

FX 6300 @ 4.8 GHz - Club 3D R9 280X RoyalQueen @ 1200 core / 1700 memory - Asus M5A99X EVO R2.0 - 8 GB Kingston HyperX Blu - Seasonic M12II EVO Bronze 620W - 1 TB WD Blue, 1 TB Seagate Barracuda - Custom water cooling

Why is it not compatible with the R9 280X? Damn, AMD, come on!

 

Even Sony is paying the royalty fee, you know.

 

But a G-Sync monitor is a lot of extra money. DP should be the port we are all using, but everyone's too hung up on HDMI and DVI to move on. DP is significantly smaller and sits securely in its slot.
 

No, Sony owns HDMI together with Panasonic, Philips, Toshiba and a few others.

The fee goes to Sony and its partners; they don't pay it.

So Sony will naturally put only HDMI in their system, just like with Blu-ray.

 

RTX2070OC 

Q: What are the requirements to use FreeSync?

A: To take advantage of the benefits of Project FreeSync, users will require... a compatible AMD Radeon GPU with a DisplayPort connection.

 

 

 

OK bish, please.

Someone PM me when an actual sync technology is developed for the monitor market as a standard.

This proprietary BS will never end; gaming tech will never see its full potential, sadly.

OK bish, please.

Someone PM me when an actual sync technology is developed for the monitor market as a standard.

This proprietary BS will never end; gaming tech will never see its full potential, sadly.

Adaptive-Sync is a completely open standard. FreeSync refers to AMD's specific software ecosystem that utilizes Adaptive-Sync.

It's basically a brand name that AMD has given to its adaptation of the technology, to sort of illustrate it as a bullet point on the graphics card box, so to speak.

I saw my GPU outputting just under 160 fps while playing GTA IV last night. And many monitors support above 140 fps? Did all of these monitors suddenly poof into existence over the past 3 months? Because as of March I was only able to find a small selection of them, and they wanted about $100 more for them. Most monitors are still locked at 60 Hz.

 

CRTs can easily run at 140Hz+; I've seen people running at 200Hz. So no, they didn't just poof into existence over the past 3 months.

 

Adaptive-Sync is a completely open standard. FreeSync refers to AMD's specific software ecosystem that utilizes Adaptive-Sync.

It's basically a brand name that AMD has given to its adaptation of the technology, to sort of illustrate it as a bullet point on the graphics card box, so to speak.

 

Yeah, I'd give it a short while after release before someone manages to force it to work on Nvidia. Nvidia could allow it themselves, but that'd basically be them admitting that G-Sync has failed.

My Build

 

GPU: MSI GTX 1080 ARMOUR | CPU: i7-9700K | RAM: 16GB 3200MHz | Motherboard: ASUS Maximus XI Gene | Storage: 2x 1TB NVMe, 1x 500GB NVMe, 1x 120GB NVMe | Case: Corsair 570X

 

Interested to see the clip where I criticized FreeSync :)

 

I did pass along NVIDIA's response, but that was because curiously AMD ONLY briefed Anandtech on the technology back at CES.

 

Here's my excited Facebook post from the initial announcement: https://www.facebook.com/LinusTech/posts/554303207999462?stream_ref=5

 

Rawr.

I don't mean to be the devil's advocate, but... the user mentioned you criticized AMD, not FreeSync... I also recall it; that's why I'm making this post (I'm usually a lurker). It was in this section of the WAN Show:

http://youtu.be/cmuxVKCG5ws?t=1h29m54s

I recall it because it was the only time I really disliked something you said (not bad, since I've followed you for a long time :D). I wasn't expecting you, as an opinion leader, to state that "(...)nvidia pushes things forward(...)" - when, in my opinion, a lot of the time they push it sideways, but that's my 2 cents - and "(...)open standard this, open standard that... sometimes that's a way of saying "we don't have the resources, the time, or wasn't important enough to develop it(...)we found this thing that might sort of work(...)"" when there are efforts, big or small, made by AMD to support such solutions. You might be talking about the 3D solutions, but it comes right after the subject of G-Sync vs FreeSync, proprietary vs standard, and you kinda randomized it in the end...

You were 100% right when you said that AMD limited the information when they announced FreeSync, yet that's a reason to give them the benefit of the doubt.

Keep up the good work!

I don't mean to be the devil's advocate, but... the user mentioned you criticized AMD, not FreeSync... I also recall it; that's why I'm making this post (I'm usually a lurker). It was in this section of the WAN Show: http://youtu.be/cmuxVKCG5ws?t=1h29m54s

I recall it because it was the only time I really disliked something you said (not bad, since I've followed you for a long time :D). I wasn't expecting you, as an opinion leader, to state that "(...)nvidia pushes things forward(...)" - when, in my opinion, a lot of the time they push it sideways, but that's my 2 cents - and "(...)open standard this, open standard that... sometimes that's a way of saying "we don't have the resources, the time, or wasn't important enough to develop it(...)we found this thing that might sort of work(...)"" when there are efforts, big or small, made by AMD to support such solutions. You might be talking about the 3D solutions, but it comes right after the subject of G-Sync vs FreeSync, proprietary vs standard, and you kinda randomized it in the end...

You were 100% right when you said that AMD limited the information when they announced FreeSync, yet that's a reason to give them the benefit of the doubt.

Keep up the good work!

He criticized one particular thing AMD has done (their terrible version of 3D Vision). What's wrong with criticizing them for a genuinely bad product? He criticized a specific product, not AMD or FreeSync.

"A few years from now I'd love to see it implemented across the board"

-Linus

That sounds like praise to me.

The rest of the things he said about it were positive or neutral.

That's why I said that he randomized it in the end, which he did when he said "(...)open standard this, open standard that... sometimes that's a way of saying "we don't have the resources, the time, or wasn't important enough to develop it(...)we found this thing that might sort of work(...)""... That's not referring to ONE particular situation, and such situations are pretty similar to FreeSync (where the statements started). In fact, at the end he clearly states that HSA is not one of those situations; what about FreeSync, the main subject of that part of the discussion?

I wouldn't post if this were just about semantics... a misunderstanding is possible, but I was not the only one who misunderstood.

So will this new technology ALSO require new video cards (not yet released) that have a newer DisplayPort?  Or is it just new monitors and new DisplayPort cables that need to be purchased?

CPU: AMD Ryzen 7 7800x3d  Motherboard:  Gigabyte B650 AORUS Elite  RAM:  Vengeance 2 x 16GB DDR5 6000   GPU:  Zotac RTX 4090

Storage:  M.2 Samsung Evo 860 TB / Samsung Evo 840 500GB   Case:  be quiet Dark Base 900   PSU: Corsair SHIFT RM1000x  Display:  ASUS AW3423DW QD-OLED 34" 3440x1440 Ultrawide w/ GSYNC
