
AMD Announces 'FreeSync 2'

HKZeroFive
12 minutes ago, Prysin said:

If you want an answer, check out the specs for the PS4 Pro. That uses a Polaris-Vega bastardisation. I assume that is built upon the hardware needed for full "FreeSync 2" GPU compliance.

Seems like it's HDR10, which would make sense since it's the most common one right now.

 

 

4 minutes ago, patrickjp93 said:

And no, 4-bit LUT is the minimum HDR requirement.

I think you mean 10-bit LUT.

A 4-bit LUT would result in your monitor only being able to display ~4000 colors, which most certainly isn't HDR-worthy (it would also look like complete crap).
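For anyone checking the math, here is that figure worked out, as a minimal sketch assuming the LUT width applies per colour channel (the reading being used on both sides of this argument):

```python
# Colors addressable at a given lookup-table width, assuming the bit
# width applies per channel (red, green, blue).
def colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel  # distinct levels per channel
    return levels ** 3              # every R,G,B combination

for bpc in (4, 8, 10):
    print(f"{bpc}-bit per channel: {colors(bpc):,} colors")

# 4-bit:  4,096 colors         (the "~4000" above)
# 8-bit:  16,777,216 colors    (true color)
# 10-bit: 1,073,741,824 colors (the floor for HDR10)
```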


2 minutes ago, LAwLz said:

~4000 colors

that would be 12-bit, that would be an AMIGA, congrats

[Image: Amiga 500 system]

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


@patrickjp93

I forgot, since you are unable to find sources yourself.

 

GPU: Codename Ellesmere... 

https://www.techpowerup.com/gpudb/2876/playstation-4-pro-gpu

 

GPU: Codename Ellesmere...

https://www.techpowerup.com/gpudb/2848/radeon-rx-480

 

OMG, it's the same GPU codename... holy SHIT... incredible.

 

Cannot be arsed to google all the reviews and interviews with Sony stating it has parts of Vega in it.


8 minutes ago, LAwLz said:

Seems like it's HDR10, which would make sense since it's the most common one right now.

 

 

I think you mean 10-bit LUT.

A 4-bit LUT would result in your monitor only being able to display ~4000 colors, which most certainly isn't HDR-worthy (it would also look like complete crap).

Nope, 4-bit, or 4000x the color range visible by non-HDR panels. It's 4 bits of color information beyond raw color, and it works for both 10-bit and 8-bit color depth.

 

LTT truly has become pathetic. At least when OpCode was here I had a worthy opponent who provided sources I'd missed.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


7 minutes ago, Space Reptile said:

that would be 12-bit, that would be an AMIGA, congrats

It's 4 bits per channel, so 4 for red, 4 for green, 4 for blue (unless you ask Patrick, who will say there are 4 color channels), for a total of 12 bits.

 

Edit: Just realized you were agreeing with me.

Yes, 4-bit lookup is what the AMIGA used. Also the Commodore 64, IIRC.


5 minutes ago, Prysin said:

@patrickjp93

I forgot, since you are unable to find sources yourself.

 

GPU: Codename Ellesmere... 

https://www.techpowerup.com/gpudb/2876/playstation-4-pro-gpu

 

GPU: Codename Ellesmere...

https://www.techpowerup.com/gpudb/2848/radeon-rx-480

 

OMG, it's the same GPU codename... holy SHIT... incredible.

 

Cannot be arsed to google all the reviews and interviews with Sony stating it has parts of Vega in it.

And now you've become both defensive and petty. Pathetic...

 

That's PS4 Pro. You said PS4 in your original post, or would you care to follow the recursion?

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


3 minutes ago, LAwLz said:

It's 4 bits per channel, so 4 for red, 4 for green, 4 for blue (unless you ask Patrick, who will say there are 4 color channels), for a total of 12 bits.

No, that's 12 bits per channel. 24-bit color is 8 per channel with no gamma. My God you truly are either stupid or ignorant...

 

8 per channel is capable of 0,0,0 - 255,255,255 in color depth. That's been possible since the early 2000s. I find it strange that I can be both 12 drinks into an evening and better informed than the 2nd and 3rd best minds on this forum. Four months ago, at least, Prysin could beat me in this state and you could keep pace.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


31 minutes ago, patrickjp93 said:

And both sources agree with me, how convenient.

No they don't; they say 1 TFLOPS max, on a configuration not used by G-Sync. Stratix 10 has a max of 10 TFLOPS, which as stated is 10x more than Stratix V, or did you completely miss that?

 

Quote

Current FPGAs have capabilities of 1+ peak TFLOPs (1), while AMD and Nvidia’s latest GPUs are rated even higher, up to nearly 4 TFLOPs. However, the peak GFLOPs or TFLOPs provides little information on the performance of a given device in a particular application.

https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-01197-radar-fpga-or-gpu.pdf

 

Quote

While not demonstrated here, early power estimators are available from FPGA vendors. A design using all of the resources and fMAX defined in Table 3 would have power consumption in the hundreds of watts. This particular FPGA is intended for ASIC prototyping applications, where clock rates and toggle rates are low. High clock rates and toggle rates will dramatically increase power consumption beyond the VCC current and power dissipation capabilities of the device.

Quote

Using logic-lock and Design Space Explorer (DSE) optimizations, the seven-core design can approach the fMAX of the single-core design, boosting it to over 500 GFLOPs, with over 10 GFLOPs/W using 28 nm FPGAs

 

https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-01222-understanding-peak-floating-point-performance-claims.pdf

 

Unless your G-Sync monitor has the same cooling as a GPU, it does not have anything even close to these kinds of performance levels. It would melt your screen and use hundreds of watts.
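To put rough numbers on that, a back-of-envelope sketch using the figures from the second white paper above (roughly 500 GFLOPS at 10 GFLOPS/W on 28 nm); the multi-TFLOPS case is a hypothetical load at the same efficiency, not a measured monitor figure:

```python
# Sustained power draw estimated as throughput divided by efficiency.
def watts(gflops: float, gflops_per_watt: float) -> float:
    return gflops / gflops_per_watt

print(watts(500, 10))   # 50.0 W  -- the paper's seven-core 28 nm design
print(watts(3000, 10))  # 300.0 W -- hypothetical 3 TFLOPS at the same
                        # efficiency: "hundreds of watts"
```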

 

Seriously, quote the exact source of your claim inline, followed by the original link. If not, I'm just going to mark every post you make as funny (meaning factually wrong) and not even bother; you're wasting mine and everyone's time.

 

30 minutes ago, patrickjp93 said:

Except AMD's tech is horrendously outdated and already beaten.

At least AMD GPUs have had 10-bit colour support since before 2008, on all cards, not just professional ones. Matrox has been using AMD GPUs for this very reason, and they specialize in broadcast and medical imaging, where colour matters. As for outdated, heh, that's only opinion.

https://www.amd.com/Documents/10-Bit.pdf


4 minutes ago, leadeater said:

No they don't; they say 1 TFLOPS max, on a configuration not used by G-Sync. Stratix 10 has a max of 10 TFLOPS, which as stated is 10x more than Stratix V, or did you completely miss that?

 

https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-01197-radar-fpga-or-gpu.pdf

 

 

https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-01222-understanding-peak-floating-point-performance-claims.pdf

 

Unless your G-Sync monitor has the same cooling as a GPU, it does not have anything even close to these kinds of performance levels. It would melt your screen and use hundreds of watts.

 

Seriously, quote the exact source of your claim inline, followed by the original link. If not, I'm just going to mark every post you make as funny (meaning factually wrong) and not even bother; you're wasting mine and everyone's time.

 

At least AMD GPUs have had 10-bit colour support since before 2008, on all cards, not just professional ones. Matrox has been using AMD GPUs for this very reason, and they specialize in broadcast and medical imaging, where colour matters. As for outdated, heh, that's only opinion.

https://www.amd.com/Documents/10-Bit.pdf

No, if you read properly, they say 1.2 in DSPs and 1.8 in logic elements, or 3.0 total.

 

My God have LTT's standards fallen.

 

No, because 5W does not melt a monitor.

 

Use your own link, donkey. It quotes me exactly.

 

So have Nvidia's. Nvidia just chooses to reserve it for professional drivers, or need I remind you Nvidia got there first with Fermi?

 

Matrox uses AMD because it's owned by Abu Dhabi, same as AMD and GloFo.

 

No, it's outdated because it's outdated.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


19 minutes ago, patrickjp93 said:

Nope, 4-bit, or 4000x the color range visible by non-HDR panels. It's 4 bits of color information beyond raw color, and it works for both 10-bit and 8-bit color depth.

 

LTT truly has become pathetic. At least when OpCode was here I had a worthy opponent who provided sources I'd missed.

 

18 minutes ago, patrickjp93 said:

No, that's 12 bits per channel. 24-bit color is 8 per channel with no gamma. My God you truly are either stupid or ignorant...

I don't even know how to reply to this, because it truly is the rambling of a madman.

It's hard to make a counterargument when you're just making stuff up and your entire post is full of completely inane comments and statements.

 

There is no such thing as "raw color" (unless it is a made-up term for something that has a real name, like "true color" or "deep color"). Do you mean a raw image format? Because those are usually 10 or 12 bit. By "raw color" do you mean 8-bit color (aka true color)?

If you meant to say "it's 4 bits beyond true color" then I would agree, because true color is 8 bits, so 4 bits beyond that would be 12-bit color depth. I would still disagree with you somewhat, though, because 10 bits is currently the lowest standard for HDR-classified content.

 

Gamma still isn't a channel, so there is no "it's 8 bits per channel with no gamma". As I explained before, you were thinking of the alpha channel, but that controls transparency, not color.

 

I really can't tell if you actually believe your own bullshit, or if you're just trolling.


3 minutes ago, LAwLz said:

 

I don't even know how to reply to this, because it truly is the rambling of a madman.

It's hard to make a counterargument when you're just making stuff up and your entire post is full of completely inane comments and statements.

 

There is no such thing as "raw color" (unless it is a made-up term for something that has a real name, like "true color" or "deep color"). Do you mean a raw image format? Because those are usually 10 or 12 bit.

Gamma still isn't a channel, so there is no "it's 8 bits per channel with no gamma". As I explained before, you were thinking of the alpha channel, but that controls transparency, not color.

 

I really can't tell if you actually believe your own bullshit, or if you're just trolling.

Alpha channel controls color and you know it. It's a blending factor, be it a background of black or otherwise.

 

Sorry, but you've been wrong 13 times out of our last 13 contentions, and this is no different.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 minute ago, LAwLz said:

-snip-

Just ignore him; it's derailing the thread, and if he continues I'll flag it as trolling. I've already overlooked the multiple personal attacks he's made.


3 minutes ago, patrickjp93 said:

Alpha channel controls color and you know it. It's a blending factor, be it a background of black or otherwise.

It controls color by making it transparent or not (which is completely different from what you were talking about), and it does not get sent to your monitor (because your monitor can't turn transparent).
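For anyone unsure what's actually being argued about, here's a minimal sketch of the standard "over" blend (nothing vendor-specific assumed): alpha weights a source colour against whatever is behind it during composition, and only the resolved RGB ever reaches the display.

```python
# Alpha is a blend weight used during composition, not a color that
# gets transmitted: the result below is plain RGB.
def blend(src_rgb, dst_rgb, alpha):
    """Standard 'over' blend: alpha=1.0 is opaque, 0.0 is invisible."""
    return tuple(round(alpha * s + (1 - alpha) * d)
                 for s, d in zip(src_rgb, dst_rgb))

red, black = (255, 0, 0), (0, 0, 0)
print(blend(red, black, 0.5))  # (128, 0, 0): half-transparent red over
                               # black, already resolved to opaque RGB
```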

Honestly, I have to stop now because I will annoy people around me if I start laughing, and I genuinely can't hold it back for much longer. This thread is really entertaining.

 

Feel free to reply though.

I don't care about personal attacks (dunno if the mods are OK with it if I say I am OK with it).


7 minutes ago, LAwLz said:

It controls color by making it transparent or not (which is completely different from what you were talking about), and it does not get sent to your monitor (because your monitor can't turn transparent).

Honestly, I have to stop now because I will annoy people around me if I start laughing, and I genuinely can't hold it back for much longer. This thread is really entertaining.

 

Feel free to reply though.

I don't care about personal attacks (dunno if the mods are OK with it if I say I am OK with it).

No need to attack a man who's already beaten.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 hour ago, patrickjp93 said:

And now you've become both defensive and petty. Pathetic...

 

That's PS4 Pro. You said PS4 in your original post, or would you care to follow the recursion?

You will find that I am neither, and that instead it is YOU who should invest in some reading glasses and/or contacts.

Here is a screenshot of your OWN reply, with my quote.

 

[Screenshot of the quoted reply]


4 minutes ago, Prysin said:

You will find that I am neither, and that instead it is YOU who should invest in some reading glasses and/or contacts.

Here is a screenshot of your OWN reply, with my quote.

 

 

I find you are both and that I stand correct.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


53 minutes ago, patrickjp93 said:

Nvidia got there first with Fermi

Fermi was first seen in the GeForce 400 series, released April 2010. Unless time doesn't work the way I think it does, dates before 2008 come before 2010. Also, the ATI Radeon HD 3000 series, released in 2007, had 10-bit colour, 3 years before Nvidia.

 

A better counter would have been to say that you could only use it in D3D applications, and that Adobe locked out 10-bit support for non-FirePro cards; the hardware was still capable of it, however.


1 minute ago, leadeater said:

Fermi was first seen in the GeForce 400 series, released April 2010. Unless time doesn't work the way I think it does, dates before 2008 come before 2010. Also, the ATI Radeon HD 3000 series, released in 2007, had 10-bit colour, 3 years before Nvidia.

 

A better counter would have been to say that you could only use it in D3D applications, and that Adobe locked out 10-bit support for non-FirePro cards; the hardware was still capable of it, however.

No and no.

 

My, my, you've gotten desperate. Radeon 3000 did not have it at all. ATI claimed it and was proven to have buggy garbage hardware.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


3 minutes ago, patrickjp93 said:

I find you are both and that I stand correct.

You're also wrong about the OG PS4. It is based on Pitcairn and Hawaii, not Tonga. The OG PS4 was in development and launched before Hawaii. Tonga launched in Q3 2014; the OG PS4 launched in Nov 2013, meaning Tonga was still in development.

 

The OG PS4 is mostly GCN 1; it is based on the 7850, but with a custom prototype of the TrueAudio DSP alongside the media decoders from GCN 2.

 

You might be right regarding the Slim version, though.


2 minutes ago, Prysin said:

You might be right regarding the Slim version, though.

The Slim is only a smaller form factor of the OG; no new specs whatsoever.

The ability to google properly is a skill of its own. 


2 minutes ago, patrickjp93 said:

No and no.

 

My, my, you've gotten desperate. Radeon 3000 did not have it at all. ATI claimed it and was proven to have buggy garbage hardware.

Garbage GPU, yes; had 10-bit support, yes. The HD 4000 series also had it, released in 2008.


4 minutes ago, Prysin said:

You're also wrong about the OG PS4. It is based on Pitcairn and Hawaii, not Tonga. The OG PS4 was in development and launched before Hawaii. Tonga launched in Q3 2014; the OG PS4 launched in Nov 2013, meaning Tonga was still in development.

 

The OG PS4 is mostly GCN 1; it is based on the 7850, but with a custom prototype of the TrueAudio DSP alongside the media decoders from GCN 2.

 

You might be right regarding the Slim version, though.

In other words, Hawaii and Tonga with a clock deficiency, and no, it launched AFTER the 290/X. And that actually lends it to being a Hawaii/Tonga cross-breed.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 minute ago, leadeater said:

Garbage GPU, yes; had 10-bit support, yes. The HD 4000 series also had it, released in 2008.

Yes and no. And no, the 4000 series also had bugged, useless "support" for it. ATI had a ton of bravado and no real substance. Nvidia got there first with Fermi in a completely bug-free scenario.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


I still don't see what this discussion has got to do with AMD changing the order of events in the display pipeline...
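For reference, that reordering is FreeSync 2's main pitch: the display reports its native gamut and luminance range up front, the GPU tone-maps directly to that range, and the monitor's own (slower) tone-mapping pass is skipped. A conceptual sketch with illustrative numbers and a deliberately crude tone-map, not AMD's actual algorithm:

```python
# Crude Reinhard-style rolloff into a panel's peak luminance (nits).
def tone_map(nits: float, peak: float) -> float:
    return nits / (1 + nits / peak)

scene, panel_peak = 4000.0, 600.0  # mastered highlight vs. real panel

# Classic HDR path: GPU maps to a generic 10,000-nit target, then the
# monitor remaps again to its actual 600-nit panel (two lossy passes).
classic = tone_map(tone_map(scene, 10000.0), panel_peak)

# FreeSync 2 path: panel capabilities known up front, mapped once.
fs2 = tone_map(scene, panel_peak)

print(round(classic), round(fs2))  # 496 522 -- one pass keeps more
                                   # of the highlight
```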

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


1 minute ago, Fetzie said:

I still don't see what this discussion has got to do with AMD changing the order of events in the display pipeline...

I'm normally above being this vindictive, but the idea you think that matters proves you don't have a clue about the rendering pipeline and don't realize it's meaningless.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


This topic is now closed to further replies.

