MPG 321URX QD-OLED Incorrect claims?

Maybe with DSC? Which means we’d be better off using HDMI? Would the same amount of compression be used?

15 minutes ago, CoolJosh3k said:

Maybe with DSC? Which means we’d be better off using HDMI? Would the same amount of compression be used?

Yes, with DSC. Plouff just did a Short Circuit episode today covering this monitor.

Community Standards | Fan Control Software

Please make sure to Quote me or @ me to see your reply!

Just because I am a Moderator does not mean I am always right. Please fact check me and verify my answer. 

 

"Black Out"

Ryzen 9 5900x | Full Custom Water Loop | Asus Crosshair VIII Hero (Wi-Fi) | RTX 3090 Founders | Ballistix 32gb 16-18-18-36 3600mhz 

1tb Samsung 970 Evo | 2x 2tb Crucial MX500 SSD | Fractal Design Meshify S2 | Corsair HX1200 PSU

 

Dedicated Streaming Rig

 Ryzen 7 3700x | Asus B450-F Strix | 16gb Gskill Flare X 3200mhz | Corsair RM550x PSU | Asus Strix GTX1070 | 250gb 860 Evo m.2

Phanteks P300A |  Elgato HD60 Pro | Avermedia Live Gamer Duo | Avermedia 4k GC573 Capture Card

 

2 minutes ago, Skiiwee29 said:

Yes, with DSC. Plouff just did a Short Circuit episode today covering this monitor.

That is why I am here actually.

Since it has to use DSC, and HDMI 2.1 has more bandwidth, would it give a better image over HDMI?

5 minutes ago, CoolJosh3k said:

That is why I am here actually.

Since it has to use DSC, and HDMI 2.1 has more bandwidth, would it give a better image over HDMI?

No. DSC is a visually lossless codec; you cannot tell the difference no matter how good your eyes are. 4K 240 Hz has been doable for a while now on the Odyssey Neo G7 and Neo G8. It's a known thing.

Not to mention, you wouldn't get a better image between them in any case: you'd either get the resolution, refresh rate and bit depth you're after, or it wouldn't work at all.

3 minutes ago, CoolJosh3k said:

That is why I am here actually.

Since it has to use DSC, and HDMI 2.1 has more bandwidth, would it give a better image over HDMI?

No, because HDMI 2.1 also uses DSC to get the full 240 Hz. HDMI 2.1 is 48 Gbps, and 4K 240 Hz uncompressed needs about 55 Gbps, which only DP 2.0 can do currently.

DSC is considered visually lossless and doesn't introduce any additional input lag, so you wouldn't be able to tell a difference anyway.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.

11 hours ago, Skiiwee29 said:

No, because HDMI 2.1 also uses DSC to get the full 240 Hz. HDMI 2.1 is 48 Gbps, and 4K 240 Hz uncompressed needs about 55 Gbps, which only DP 2.0 can do currently.

That 55 Gbps figure is for SDR; for HDR it would be around 70 Gbps, basically the DP 2.0 cap.
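For anyone who wants to check those figures, here is a rough back-of-the-envelope calculation. The ~2314 MHz pixel clock is my own assumption based on reduced-blanking timings for 4K 240 Hz, so treat the outputs as approximate rather than monitor-specific:

```python
# Rough link-bandwidth check for 4K 240 Hz.
# The 2314 MHz pixel clock is an assumption based on CVT reduced-blanking
# timings for 3840x2160 @ 240 Hz; results are approximate.
PIXEL_CLOCK_HZ = 2314e6

def uncompressed_gbps(bits_per_component: int) -> float:
    """Uncompressed RGB data rate at the given bit depth, in Gbit/s."""
    return PIXEL_CLOCK_HZ * 3 * bits_per_component / 1e9

print(f"8-bit (SDR):  {uncompressed_gbps(8):.1f} Gbps")   # ~55.5, over HDMI 2.1's 48 Gbps
print(f"10-bit (HDR): {uncompressed_gbps(10):.1f} Gbps")  # ~69.4, near the DP 2.0 cap
```

Which lines up with the ~55 Gbps (SDR) and ~70 Gbps (HDR) numbers quoted above.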

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lancool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX Speed Silver) | Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |

19 hours ago, GuiltySpark_ said:

No. DSC is a visually lossless codec; you cannot tell the difference no matter how good your eyes are. 4K 240 Hz has been doable for a while now on the Odyssey Neo G7 and Neo G8. It's a known thing.

Not to mention, you wouldn't get a better image between them in any case: you'd either get the resolution, refresh rate and bit depth you're after, or it wouldn't work at all.

I worry about whether DSC really is visually lossless, given that the criterion is only "when all the observers fail to correctly identify the reference image more than 75% of the trials" (from https://en.wikipedia.org/wiki/Display_Stream_Compression).

I had also hoped that, since "DSC can work in constant or variable bitrate mode" (also from https://en.wikipedia.org/wiki/Display_Stream_Compression), it would use HDMI's higher bitrate for better results.

Maybe I am just getting too technical here?

This might all be concerning if DSC were a new tech that no one had used before. In reality, it's been part of many displays going back nearly a decade.

11 hours ago, CoolJosh3k said:

I worry about whether DSC really is visually lossless, given that the criterion is only "when all the observers fail to correctly identify the reference image more than 75% of the trials" (from https://en.wikipedia.org/wiki/Display_Stream_Compression).

I had also hoped that, since "DSC can work in constant or variable bitrate mode" (also from https://en.wikipedia.org/wiki/Display_Stream_Compression), it would use HDMI's higher bitrate for better results.

Maybe I am just getting too technical here?

If they fail to identify it more than 75% of the time, then it's pretty safe to say they're just guessing. DSC is visually lossless, meaning even if you try, you won't know whether it's in use. Where has all this discourse about DSC being an issue come from lately? It's been used very frequently in higher-end monitors for at least 5 years now. It won't impact picture quality, input lag or anything else. I've used several monitors that make use of DSC in the past and none of them showed any signs of compression.
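To put rough numbers on how hard DSC actually has to work in a case like this, here is a sketch. Both constants are my approximations (an assumed reduced-blanking pixel clock for 4K 240 Hz, and DP 1.4's ~25.92 Gbps payload over HBR3 x4 lanes after 8b/10b encoding), not figures for this specific monitor:

```python
# Sketch: the bits-per-pixel budget DSC must hit for 4K 240 Hz over DP 1.4.
# Both constants are approximations, not monitor-specific figures.
PIXEL_CLOCK_HZ = 2314e6      # assumed ~4K 240 Hz reduced-blanking pixel clock
DP14_PAYLOAD_GBPS = 25.92    # HBR3 x4 lanes after 8b/10b encoding

budget_bpp = DP14_PAYLOAD_GBPS * 1e9 / PIXEL_CLOCK_HZ  # ~11.2 bpp available
ratio = 24 / budget_bpp                                # vs. 24 bpp uncompressed 8-bit RGB

print(f"budget: {budget_bpp:.1f} bpp ({ratio:.2f}:1 compression)")
```

A roughly 2:1 ratio is comfortably within DSC's up-to-3:1 capability, which is consistent with the visually-lossless claims above.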

Come to think of it, what part of the graphics card handles the compression?

35 minutes ago, CoolJosh3k said:

Come to think of it, what part of the graphics card handles the compression?

A dedicated chip. 

8 hours ago, GuiltySpark_ said:

A dedicated chip. 

Do you (or anyone else) know if it counts towards the 3-stream limit of the NVENC chip? Or is it a part of the card dedicated to just doing DSC?

52 minutes ago, CoolJosh3k said:

Do you (or anyone else) know if it counts towards the 3-stream limit of the NVENC chip? Or is it a part of the card dedicated to just doing DSC?

It's done in dedicated hardware which is part of the DisplayPort transmitter. It isn't done by NVENC.
