
ELMB (not sync) with HDR

BTGbullseye

Does ELMB prevent HDR when VRR isn't enabled? I can't find anything online about it, as everything exclusively talks about ELMB Sync with HDR. I don't care about VRR; I do care about HDR and motion blur.

CPU: Ryzen 7 5800X Cooler: Arctic Liquid Freezer II 120mm AIO with push-pull Arctic P12 PWM fans RAM: G.Skill Ripjaws V 4x8GB 3600 16-16-16-30

Motherboard: ASRock X570M Pro4 GPU: ASRock RX 5700 XT Reference with Eiswolf GPX-Pro 240 AIO Case: Antec P5 PSU: Rosewill Capstone 750M

Monitor: ASUS ROG Strix XG32VC Case Fans: 2x Arctic P12 PWM Storage: HP EX950 1TB NVMe, Mushkin Pilot-E 1TB NVMe, 2x Constellation ES 2TB in RAID1

https://hwbot.org/submission/4497882_btgbullseye_gpupi_v3.3___32b_radeon_rx_5700_xt_13min_37sec_848ms


ELMB strobes the backlight, which reduces max brightness; how much depends on the pulse width. But regardless, HDR requires high brightness, and since most monitors don't get all that bright for HDR, usually hovering around 500 nits at best, activating backlight strobing will cut that significantly.

 

So yes, it most likely disables HDR capability immediately, for that very reason.
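To put rough numbers on that (purely illustrative; the 400-nit peak and the duty cycles below are assumptions, not measured figures for any particular monitor):

```python
# Rough sketch: average light output of a strobed backlight.
# Perceived brightness scales with the fraction of each refresh
# cycle the backlight is actually on (the strobe duty cycle).

def strobed_brightness(peak_nits: float, duty_cycle: float) -> float:
    """Average luminance when the backlight is pulsed instead of always on."""
    return peak_nits * duty_cycle

peak = 400.0  # assumed DisplayHDR 400-class peak
for duty in (1.0, 0.5, 0.25):  # 1.0 = strobing off
    print(f"duty {duty:.0%}: ~{strobed_brightness(peak, duty):.0f} nits")
# duty 100%: ~400 nits
# duty 50%: ~200 nits
# duty 25%: ~100 nits
```

Shorter pulses mean better motion clarity but proportionally less light, which is exactly the trade-off that collides with HDR's brightness requirements.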

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w / Corsair RM 750w Gold (2021) |

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)... GTX 980ti | Sound: Asus Xonar D2X - Z5500 - FiiO X3K DAP/DAC - ATH-M50S | Case: Phanteks Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Seems a bit arbitrary to me. There doesn't seem to be any technological reason to prevent it, just an "HDR should be eye-searing" philosophy.



10 hours ago, BTGbullseye said:

Seems a bit arbitrary to me. There doesn't seem to be any technological reason to prevent it, just an "HDR should be eye-searing" philosophy.

Well there is.

 

As I explained, flickering the backlight, which is what ELMB is, reduces brightness. If brightness is reduced you can't get HDR. High dynamic range is the difference between the brightest and darkest parts of an image; if the brightness can't get bright, you can't have HDR.

Sending an HDR signal to a display incapable of producing HDR is pointless; you'll end up with massive amounts of clipping, which will look worse than an SDR signal.

 

By definition, HDR is more or less 'eye searing'. That's the point of HDR: to enable high brightness and low darkness. HDR tries to replicate real life. If you try looking at the sun (don't), well, THAT'S eye searing. HDR tries to emulate that by making bright objects, like suns, as bright as possible.

If you play a game walking through a dark, dense forest and then come out into a clearing in bright sunshine, in real life your eyes need time to adjust; you squint because initially it's too bright for you. Games currently 'fake' this by slowly increasing the brightness of that scene. With HDR you wouldn't need to do that, because your eyes would do it for real as the display jumps straight from a dim, dark scene to, as you say, 'eye searing' brightness. HDR will likely never get dangerously bright. Uncomfortably bright, sure, but that would be the entire point of those scenes.

 

SDR is the opposite: if you don't want 'eye searing', what you're looking for is SDR, a standard range from the brightest part to the darkest part.

 



On 12/30/2022 at 8:04 AM, SolarNova said:

As I explained, flickering the backlight, which is what ELMB is, reduces brightness. If brightness is reduced you can't get HDR. High dynamic range is the difference between the brightest and darkest parts of an image; if the brightness can't get bright, you can't have HDR.

Standard movie theater HDR content is at 48-100 nits. (https://www.projectorcentral.com/HDR-From-A-to-Z.htm) The reduction in brightness isn't that much of a deal in a light-controlled environment. (which I have, it's called "a basement with blackout curtains") Unless they have the worst VA panels ever made (specifically looking at the XG32VC), this shouldn't be a problem for an HDR400-capable monitor.

 

My current monitor can already do backlight strobing and HDR combined, but is only 1080p. (and is MUCH older)

On 12/30/2022 at 8:04 AM, SolarNova said:

By definition, HDR is more or less 'eye searing'

HDR is contrast, not brightness. Eye searing ignores the contrast in favor of being as bright as physically possible, which is what the 1000 nit screens look like to me. They always seem to have really bad black levels. (excluding OLED)

On 12/30/2022 at 8:04 AM, SolarNova said:

SDR is the opposite: if you don't want 'eye searing', what you're looking for is SDR, a standard range from the brightest part to the darkest part.

SDR lacks the color range that I'm looking for. I want the 10-bit color that is locked into an HDR mode. There is no logical reason that HDR or 10-bit color needs to be locked to only be available above certain brightness levels.



15 hours ago, BTGbullseye said:

Standard movie theater HDR content is at 48-100 nits.

The part of that article you highlighted, the 48-100 nit figure, is still talking about SDR, not HDR.

 

Quote

In conventional commercial cinemas with professional projectors, the standardized peak brightness of the image is 48 nits. (Dolby Cinema has a peak brightness of about 100 nits and a much lower black level.) So, a conventional commercial cinema presents only about half the dynamic range of an SDR image at home.

Nowadays we're at a point where the TVs at home can deliver significantly better image quality than theater projectors. So why still use cinema as the baseline?

 

15 hours ago, BTGbullseye said:

The reduction in brightness isn't that much of a deal in a light-controlled environment. (which I have, it's called "a basement with blackout curtains") Unless they have the worst VA panels ever made (specifically looking at the XG32VC), this shouldn't be a problem for an HDR400-capable monitor.

In most cases using ELMB will reduce brightness by 50%. So most HDR400 monitors will end up around 200 nits, which is well below the baseline of what HDR needs to have any significant effect.

 

15 hours ago, BTGbullseye said:

My current monitor can already do backlight strobing and HDR combined, but is only 1080p. (and is MUCH older)

HDR is contrast, not brightness. Eye searing ignores the contrast in favor of being as bright as physically possible, which is what the 1000 nit screens look like to me. They always seem to have really bad black levels. (excluding OLED)

But if your monitor can only do 100-200 nits while in HDR, you don't get significantly better contrast than just using SDR. You're misunderstanding what HDR is about on a basic level. It's not that most content is driven to 1000 nits. It's that specular highlights (like reflections, the sun, etc.) can reach high brightness levels to mimic a more realistic scene. The content overall is still mastered at roughly 100 nits, with only highlights going significantly over that. That's why some people who run their SDR monitors at 300 nits or so find that HDR is actually too dim for their taste.

 

SDR brightness is relative. That means the content's brightness is entirely dependent on your monitor's brightness and what you're setting it to.

So a light bulb that was mastered in SDR to hit 100 nits can be 300 nits on your monitor depending on your settings or monitor's capabilities.

 

HDR brightness is absolute. That means the brightness is defined in mastering and baked into the content.

A 100 nit light bulb will be 100 nits, no matter if the monitor is capable of 100 nits or 1000 nits.
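A minimal sketch of that difference (the 300-nit SDR setting and the panel peaks below are assumed numbers, just to show the scaling):

```python
# SDR: relative. A value mastered at SDR reference white simply lands
# wherever the user's brightness setting puts it.
def sdr_output_nits(fraction_of_reference_white: float, monitor_setting_nits: float) -> float:
    return fraction_of_reference_white * monitor_setting_nits

# HDR (PQ): absolute. The mastered nit value is the target; a display
# clips or tone-maps anything above its own peak.
def hdr_output_nits(mastered_nits: float, panel_peak_nits: float) -> float:
    return min(mastered_nits, panel_peak_nits)

# The light bulb example from above:
print(sdr_output_nits(1.0, 300.0))     # 300.0 -> tracks the monitor setting
print(hdr_output_nits(100.0, 400.0))   # 100.0 -> stays at 100 nits
print(hdr_output_nits(100.0, 1000.0))  # 100.0 -> still 100 nits on a brighter panel
print(hdr_output_nits(1000.0, 400.0))  # 400.0 -> a bright specular highlight gets clipped
```

That last line is where a low peak hurts: the mastered highlight is simply cut off instead of rendered.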

 

15 hours ago, BTGbullseye said:

SDR lacks the color range that I'm looking for. I want the 10-bit color that is locked into an HDR mode. There is no logical reason that HDR or 10-bit color needs to be locked to only be available above certain brightness levels.

Just use 8-bit. Modern GPUs force FRC when using 8-bit, which results in better gradient handling than native 10-bit on most monitors. Higher bit depth doesn't actually result in more saturated colors; the only thing it impacts is gradient handling.
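For context, the gradient argument comes down to quantization: 8-bit gives 256 steps per channel, 10-bit gives 1024, and temporal dithering (FRC) trades a little noise for the in-between shades. A toy sketch of the idea, not how any particular GPU actually implements it:

```python
import random

def quantize(value: float, bits: int) -> int:
    """Round a 0..1 signal to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1   # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

def dithered_quantize(value: float, bits: int) -> int:
    """Add sub-step noise before rounding; averaged over frames this
    approximates shades the panel can't show in a single frame."""
    levels = (1 << bits) - 1
    noisy = value * levels + random.uniform(-0.5, 0.5)
    return max(0, min(levels, round(noisy)))

shade = 0.5002                          # sits between two 8-bit codes
print(quantize(shade, 8))               # always 128 -> neighbouring shades band together
frames = [dithered_quantize(shade, 8) for _ in range(1000)]
print(sum(frames) / len(frames) / 255)  # ~0.500 averaged over time
```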

 

All in all it seems you want HDR just because of the color information, but you don't understand that there are drawbacks of using HDR on a low brightness display. Using ELMB to limit peak brightness is not worth it.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


On 1/4/2023 at 3:20 AM, Stahlmann said:

Just use 8-bit. Modern GPUs force FRC when using 8-bit, which results in better gradient handling than native 10-bit on most monitors. Higher bit depth doesn't actually result in more saturated colors; the only thing it impacts is gradient handling.

My GPU and monitor both support native 10-bit color (the panel is 8-bit FRC) and it's significantly better than any native 8-bit I have ever even heard of. (also, saturation is not what I'm after, since HDR invariably is less saturated than SDR unless you crank the saturation)

On 1/4/2023 at 3:20 AM, Stahlmann said:

All in all it seems you want HDR just because of the color information

That is correct. There are reasons for this that would not apply to the average person, namely that I have abnormally high color vision capability, especially in lower illumination levels.

On 1/4/2023 at 3:20 AM, Stahlmann said:

but you don't understand that there are drawbacks of using HDR on a low brightness display.

I fully understand the drawbacks. The problem is when a company refuses to allow me to decide to do the inadvisable for arbitrary reasons. If there was a technical reason for it, I could understand, but everything everyone is saying indicates that there is no technical reason for it.

On 1/4/2023 at 3:20 AM, Stahlmann said:

Using ELMB to limit peak brightness is not worth it.

Who are you to make that decision for me?



6 hours ago, BTGbullseye said:

HDR invariably is less saturated than SDR

Hi. Your understanding of HDR is basically incorrect.

 

  1. I agree that contrast is important in HDR for deeper blacks. However, brightness is also a key requirement for HDR.
  2. Your current monitor is only HDR 400 certified, which is well known to be too low a minimum requirement and causes HDR content to look much duller than SDR; this is why you feel HDR is invariably less saturated than SDR. With a proper HDR 600 device or above (HDR 1000 recommended), you will notice HDR is more vibrant and has deeper color than SDR. (HDR 400 and HDR True Black 400 for OLED are two different standards with different output.)
  3. ELMB normally cuts the maximum brightness in half, which is why not many monitors are currently capable of combining the two. For example, HDR 400 halved is only 200 cd/m², HDR 600 halved is only 300 cd/m², and even HDR 1000 halved is only 500 cd/m². After applying ELMB, the brightness for HDR output will be insufficient.

PC: AMD Ryzen 9 5900X, Gigabyte GeForce RTX 4090 OC 24G, X570 AORUS Elite WIFI Motherboard, HyperX FURY 32GB DDR4-3200 RGB RAM, Creative Sound Blaster AE-9 Sound Card, Samsung 970 Evo Plus M.2 SATA 500GB, ADATA XPG SX8200 Pro M.2 SATA 2TB, Asus HyperX Fury RGB SSD 960GB, Seagate Barracuda 7200RPM 3.5 HDD 2TB, Cooler Master MASTERLIQUID ML240R ARGB, Cooler Master MASTERFAN MF120R ARGB, Cooler Master ELV8 Graphics Card Holder ARGB, Asus ROG Strix 1000G PSU, Lian Li LANCOOL II MESH RGB Case, Windows 11 Pro (22H2).


Laptop: Asus Vivobook "A Bathing Ape" - ASUS Vivobook S 15 OLED BAPE Edition: Intel i9-13900H, 16 GB RAM, 15.6" 2.8K 120hz OLED | Apple MacBook Pro 14" 2023: M2 Pro, 16 GB RAM, NVMe 512 GB | Asus VivoBook 15 OLED: Intel® Core™ i3-1125G4, Intel UHD, 8 GB RAM, Micron NVMe 512 GB | Illegear Z5 SKYLAKE: Intel Core i7-6700HQ, Nvidia Geforce GTX 970M, 16 GB RAM, ADATA SU800 M.2 SATA 512GB.

 

Monitor: Samsung Odyssey OLED G9 49" 5120x1440 240hz QD-OLED HDR, LG OLED Flex 42LX3QPSA 41.5" 3840x2160 bendable 120hz WOLED, AOC 24G2SP 24" 1920x1080 165hz SDR, LG UltraGear Gaming Monitor 34" 34GN850 3440x1440 144hz (160hz OC) NanoIPS HDR, LG Ultrawide Gaming Monitor 34" 34UC79G 2560x1080 144hz IPS SDR, LG 24MK600 24" 1920x1080 75hz Freesync IPS SDR, BenQ EW2440ZH 24" 1920x1080 75hz VA SDR.


Input Device: Asus ROG Azoth Wireless Mechanical Keyboard, Asus ROG Chakram X Origin Wireless Mouse, Logitech G913 Lightspeed Wireless RGB Mechanical Gaming Keyboard, Logitech G502X Wireless Mouse, Logitech G903 Lightspeed HERO Wireless Gaming Mouse, Logitech Pro X, Logitech MX Keys, Logitech MX Master 3, XBOX Wireless Controller Covert Forces Edition, Corsair K70 RAPIDFIRE Mechanical Gaming Keyboard, Corsair Dark Core RGB Pro SE Wireless Gaming Mouse, Logitech MK850 Wireless Keyboard & Mouse Combos.


Entertainment: LG 55" C9 OLED HDR Smart UHD TV with AI ThinQ®, 65" Samsung AU7000 4K UHD Smart TV, SONOS Beam (Gen 2) Dolby Atmos Soundbar, SONOS Sub Mini, SONOS Era 100 x2, SONOS Era 300 Dolby Atmos, Logitech G560 2.1 USB & Bluetooth Speaker, Logitech Z625 2.1 THX Speaker, Edifier M1370BT 2.1 Bluetooth Speaker, LG SK9Y 5.1.2 channel Dolby Atmos, Hi-Res Audio SoundBar, Sony MDR-Z1R, Bang & Olufsen Beoplay EX, Sony WF-1000XM5, Sony WH-1000XM5, Sony WH-1000XM4, Apple AirPods Pro, Samsung Galaxy Buds2, Nvidia Shield TV Pro (2019 edition), Apple TV 4K (2017 & 2021 Edition), Chromecast with Google TV, Sony UBP-X700 UltraHD Blu-ray, Panasonic DMP-UB400 UltraHD Blu-ray.

 

Mobile & Smart Watch: Apple iPhone 15 Pro Max (Natural Titanium), Apple Watch Series 8 Stainless Steel with Milanese Loop (Graphite).

 

Others Gadgets: Asus SBW-06D2X-U Blu-ray RW Drive, 70 TB Ext. HDD, j5create JVCU100 USB HD Webcam with 360° rotation, ZTE UONU F620, Maxis Fibre WiFi 6 Router, Fantech MPR800 Soft Cloth RGB Gaming Mousepad, Fantech Headset Headphone Stand AC3001S RGB Lighting Base Tower, Infiniteracer RGB Gaming Chair


12 hours ago, BTGbullseye said:

That is correct. There are reasons for this that would not apply to the average person, namely that I have abnormally high color vision capability, especially in lower illumination levels.

 

Who are you to make that decision for me?

I was merely stating that the features work towards different goals, so using both at the same time kinda defeats the purpose of one.

If there is a medical reason for it, then I obviously lack the understanding of your situation to make a call on that.

 

12 hours ago, BTGbullseye said:

My GPU and monitor both support native 10-bit color (the panel is 8-bit FRC) and it's significantly better than any native 8-bit I have ever even heard of.

Just try it. I know common sense says you should use 10-bit if you have the option to. Higher is better, after all, right? But so far every monitor I've tested since discovering this has handled gradients better with 8-bit enabled and letting my 3080 do the dithering. The resulting 8-bit + FRC is better than the NATIVE 10-bit setting on my C2 OLED. (it is an actual, real 10-bit panel)

 

12 hours ago, BTGbullseye said:

 since HDR invariably is less saturated than SDR unless you crank the saturation

That is fundamentally wrong. SDR uses the sRGB or Rec.709 color gamuts, which do not extend as far as D65-P3 or Rec.2020, the color gamuts used for HDR. SDR is only more saturated if your monitor takes sRGB saturation values but displays them in a wider color gamut. That's called oversaturation and is a misbehaviour of the display, nothing else.
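To put a rough number on the gamut difference: the xy primaries below are the published Rec.709, DCI-P3 (D65) and Rec.2020 values, and triangle area on the CIE xy diagram is only a crude proxy for "how many more colours", but it illustrates the gap:

```python
# Compare colour gamut sizes via the triangle spanned by each
# standard's red/green/blue primaries on the CIE 1931 xy diagram.

PRIMARIES = {
    "Rec.709 / sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3 (D65)":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points) -> float:
    """Shoelace formula for the triangle defined by three (x, y) primaries."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = triangle_area(PRIMARIES["Rec.709 / sRGB"])
for name, points in PRIMARIES.items():
    print(f"{name}: {triangle_area(points) / base:.2f}x the Rec.709 area")
# Rec.709 / sRGB: 1.00x, DCI-P3 (D65): ~1.36x, Rec.2020: ~1.89x
```

A display that keeps sRGB content inside the smaller triangle looks correct; one that stretches it across the wider gamut is exactly the oversaturation described above.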



8 hours ago, Stahlmann said:

Just try it.

I have... That's why I'm going so far as to reduce the refresh rate from 170Hz to 120Hz just to get it.

8 hours ago, Stahlmann said:

That is fundamentally wrong. SDR uses the sRGB or Rec.709 color gamuts, which do not extend as far as D65-P3 or Rec.2020, the color gamuts used for HDR. SDR is only more saturated if your monitor takes sRGB saturation values but displays them in a wider color gamut. That's called oversaturation and is a misbehaviour of the display, nothing else.

SDR content on an HDR display is always undersaturated. HDR content on SDR displays is always oversaturated. My statement stands.



3 minutes ago, BTGbullseye said:

SDR content on an HDR display is always undersaturated. HDR content on SDR displays is always oversaturated. My statement stands.

One could argue that's not a technical issue, but rather mostly badly optimized monitors. A decent monitor where the manufacturer actually put some time into calibration won't be over or undersaturated no matter what dynamic range setting you're at. But sadly, not even most high-end monitors have that kind of attention to detail put into them.



On 1/12/2023 at 10:41 AM, Stahlmann said:

One could argue that's not a technical issue, but rather mostly badly optimized monitors. A decent monitor where the manufacturer actually put some time into calibration won't be over or undersaturated no matter what dynamic range setting you're at. But sadly, not even most high-end monitors have that kind of attention to detail put into them.

Agreed, and that's just another point where manufacturers are doing stupid shit with their monitors.



On 1/13/2023 at 12:37 AM, BTGbullseye said:

I have... That's why I'm going so far as to reduce the refresh rate from 170Hz to 120Hz just to get it.

SDR content on an HDR display is always undersaturated. HDR content on SDR displays is always oversaturated. My statement stands.

Eh, it's actually not. Windows, especially Windows 11, will automatically switch between SDR and HDR for you if you enable HDR.

 

Also, if all you want is to display wide colour gamut content correctly, you can always just manually switch the colour space on your display. If it has decent colour coverage, it should allow you to select DCI-P3 directly from its menu.

 

That, and you've said you don't care about BFI (or what Asus calls ELMB), so why not just disable it? BFI on LCD tends to be horrible anyway.


  • 2 weeks later...
On 1/19/2023 at 10:59 PM, e22big said:

Eh, it's actually not. Windows, especially Windows 11, will automatically switch between SDR and HDR for you if you enable HDR.

I don't use Windows 11, and Windows 10 doesn't auto-switch at all unless a program is in exclusive fullscreen.

On 1/19/2023 at 10:59 PM, e22big said:

Also, if all you want is to display wide colour gamut content correctly, you can always just manually switch the colour space on your display. If it has decent colour coverage, it should allow you to select DCI-P3 directly from its menu.

I have already done that, and that really doesn't relate to my issue at all.

On 1/19/2023 at 10:59 PM, e22big said:

That, and you've said you don't care about BFI (or what Asus calls ELMB), so why not just disable it? BFI on LCD tends to be horrible anyway.

That is completely wrong. I wanted to be able to use ELMB with HDR, and didn't care about variable refresh.


