4K 60FPS = 8K 30FPS?

zzzzzzzhhh

Is this a stupid question? I just wanted to know if a card doing 60fps 4K can do 30fps 8K gaming, and if that's not how it works, how does it work?


3 minutes ago, Vespertine said:

Is this a stupid question? I just wanted to know if a card doing 60fps 4K can do 30fps 8K gaming, and if that's not how it works, how does it work?

That's not how that works. 8K has 4 times as many pixels as 4K, which in turn has 4 times as many as 1080p.

 

So a GPU capable of 4K 60FPS would be doing about 15 FPS at 8K.

Linux Daily Driver:

CPU: R5 2400G

Motherboard: MSI B350M Mortar

RAM: 32GB Corsair Vengeance LPX DDR4

HDD: 1TB POS HDD from an old Dell

SSD: 256GB WD Black NVMe M.2

Case: Phanteks Mini XL DS

PSU: 1200W Corsair HX1200

 

Gaming Rig:

CPU: i7 6700K @ 4.4GHz

Motherboard: Gigabyte Z270-N Wi-Fi ITX

RAM: 16GB Corsair Vengeance LPX DDR4

GPU: Asus Turbo GTX 1070 @ 2GHz

HDD: 3TB Toshiba something or other

SSD: 512GB WD Black NVMe M.2

Case: Shared with Daily - Phanteks Mini XL DS

PSU: Shared with Daily - 1200W Corsair HX1200

 

Server

CPU: Ryzen7 1700

Motherboard: MSI X370 SLI Plus

RAM: 8GB Corsair Vengeance LPX DDR4

GPU: Nvidia GT 710

HDD: 1X 10TB Seagate ironwolf NAS Drive.  4X 3TB WD Red NAS Drive.

SSD: Adata 128GB

Case: NZXT Source 210 (white)

PSU: EVGA 650 G2 80Plus Gold


That's not how it works. 8K has 4x as many pixels as 4K (33,177,600 vs 8,294,400), so the bandwidth required for 4K 60Hz is the same as for 8K 15Hz, not 30Hz.

 

In terms of connection standards, 4K 60Hz can be done over DisplayPort 1.2 and HDMI 2.0, but 8K 30Hz cannot be done on either specification AFAIK. It would require DP 1.3/1.4 or HDMI 2.1 support, neither of which is guaranteed just because the card supports the previous standard. So while something like a GTX 1080 Ti, which should be DP 1.3/1.4 compliant, may support 8K 60Hz at some point, current support for DP 1.2 does not mean a card will support later versions. HDMI is a similar situation, but AFAIK there are no cards with HDMI 2.1 compliant controllers right now.
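The bandwidth claim can be sanity-checked with a quick back-of-the-envelope calculation. This is only a sketch: the figures below are the effective (post-encoding) data rates I'm assuming for each standard, and the formula ignores blanking/timing overhead, so real requirements are somewhat higher.

```python
# Effective data rates per link standard, in Gbit/s (approximate).
LINKS = {
    "DP 1.2": 17.28,      # HBR2 x4 lanes, 8b/10b encoding
    "HDMI 2.0": 14.4,     # 18 Gbit/s raw, 8b/10b encoding
    "DP 1.3/1.4": 25.92,  # HBR3 x4 lanes, 8b/10b encoding
    "HDMI 2.1": 42.67,    # 48 Gbit/s raw, 16b/18b encoding
}

def required_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed pixel data rate (8-bit RGB by default), ignoring blanking."""
    return width * height * fps * bits_per_pixel / 1e9

for name, w, h, fps in [("4K 60Hz", 3840, 2160, 60), ("8K 30Hz", 7680, 4320, 30)]:
    need = required_gbps(w, h, fps)
    fits = [link for link, cap in LINKS.items() if cap >= need]
    print(f"{name}: ~{need:.1f} Gbit/s -> fits: {', '.join(fits)}")
```

8K 30Hz comes out to roughly 24 Gbit/s of pixel data alone, which exceeds DP 1.2 and HDMI 2.0 but fits within DP 1.3/1.4, consistent with the post above.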


1 minute ago, MedievalMatt said:

That's not how that works. 8K has 4 times as many pixels as 4K, which in turn has 4 times as many as 1080p.

 

So a GPU capable of 4K 60FPS would be doing about 15 FPS at 8K.

 

2 minutes ago, Comic_Sans_MS said:

8K is 4 times the pixels of 4K.

Therefore, 8K 15FPS would be the same pixel throughput as 4K 60FPS.

Oh, I forgot about that, thanks.

 


Just now, Vespertine said:

 

Oh, I forgot about that, thanks.

 

In theory (this is also not how it works in reality), a 4K 60FPS-capable GPU used in SLI could do 30FPS at 8K. But that also implies the same SLI setup would do 120FPS at 4K.


Just now, Sampsy said:

I also suspect that performance doesn't scale so linearly, especially if you happen to run out of VRAM rendering at 8k. 

If 8GB of VRAM is fine for 4K, 16GB with VEGA is fine for 8K right?


15 minutes ago, Vespertine said:

Is this a stupid question? I just wanted to know if a card doing 60fps 4K can do 30fps 8K gaming, and if that's not how it works, how does it work?

 

Not quite. 8K has 4x the pixels of 4K, so running a game at 4K 60fps would translate to roughly 15fps at 8K.

Specs: CPU - Intel i7 8700K @ 5GHz | GPU - Gigabyte GTX 970 G1 Gaming | Motherboard - ASUS Strix Z370-G WIFI AC | RAM - XPG Gammix DDR4-3000MHz 32GB (2x16GB) | Main Drive - Samsung 850 Evo 500GB M.2 | Other Drives - 7TB/3 Drives | CPU Cooler - Corsair H100i Pro | Case - Fractal Design Define C Mini TG | Power Supply - EVGA G3 850W


3 minutes ago, Vespertine said:

If 8GB of VRAM is fine for 4K, 16GB with VEGA is fine for 8K right?

Again, not really. Remember, it's 4×, not double, so if 8GB is the minimum for 4K, 32GB is the minimum for 8K.

Want to custom loop?  Ask me more if you are curious

 


1 minute ago, Damascus said:

Again, not really. Remember, it's 4×, not double, so if 8GB is the minimum for 4K, 32GB is the minimum for 8K.

Forgot again, ha. Though I wouldn't say 8GB is the minimum, more like recommended.


Just now, Vespertine said:

Forgot again, ha.

Honestly, the 16GB Vega should do best out of the current mainstream GPU offerings anyway.



Here's how the math works out:

4K: 3840 × 2160 = 8,294,400 pixels

8K: 7680 × 4320 = 33,177,600 pixels

8,294,400 / 33,177,600 = 0.25

8K has four times the pixels of 4K. So if we assume perfect scaling, then 4K 60fps = 8K 15fps.
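The arithmetic above can be checked in a few lines of Python:

```python
# Pixel counts and the ideal (perfectly linear) framerate trade-off.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

assert pixels["4K"] == 8_294_400
assert pixels["8K"] == 33_177_600
assert pixels["8K"] == 4 * pixels["4K"]

# Same pixel throughput: 4K at 60fps vs 8K at some lower framerate.
fps_8k = 60 * pixels["4K"] / pixels["8K"]
print(fps_8k)  # 15.0
```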


The "8K is 4x 4K" point has been beaten to death already, so I'd just like to raise awareness of this too:

1 hour ago, Sampsy said:

I also suspect that performance doesn't scale so linearly, especially if you happen to run out of VRAM rendering at 8k. 

since yeah, an increase in framerate might not perfectly match a reduction in resolution, or vice versa. I know this for a fact: if you compare, for example, an R9 Fury to a similar card at 1080p, and then compare the same two cards at 4K, they pull away from each other (depending on what you pick for the other card, but still).

Solve your own audio issues  |  First Steps with RPi 3  |  Humidity & Condensation  |  Sleep & Hibernation  |  Overclocking RAM  |  Making Backups  |  Displays  |  4K / 8K / 16K / etc.  |  Do I need 80+ Platinum?

If you can read this you're using the wrong theme.  You can change it at the bottom.


1 hour ago, Oshino Shinobu said:

In terms of connection standards, 4K 60Hz can be done over DisplayPort 1.2 and HDMI 2.0, but 8K 30Hz cannot be done on either specification AFAIK. It would require DP 1.3/1.4 or HDMI 2.1 support, neither of which is guaranteed just because the card supports the previous standard. So while something like a GTX 1080 Ti, which should be DP 1.3/1.4 compliant, may support 8K 60Hz at some point, current support for DP 1.2 does not mean a card will support later versions. HDMI is a similar situation, but AFAIK there are no cards with HDMI 2.1 compliant controllers right now.

Current DP 1.4 cards are capable of 8k@30Hz output. The GT 1030 is capable of it, so I believe any other Pascal card is capable of the same.

I am not talking about gaming here, only video output. With DSC compression, 8k@60Hz with HDR (as well as 4k@120Hz) is possible via DP 1.4; however, you need a DSC decoder in your monitor (still not commercially available). More about DSC here: http://www.vesa.org/wp-content/uploads/2014/04/VESA_DSC-ETP200.pdf

DSC is part of the VESA standard (as is DisplayPort), so this is not some proprietary mumbo jumbo. Unless Nvidia "reinvents hot water" with something of their own, as they usually like to do.

 

Asus has apparently already utilized this technology; more about it here: https://www.kotaku.com.au/2016/06/asus-will-apparently-have-the-worlds-first-144hz-4k-monitor/
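A rough calculation shows why DSC matters here. This is only a sketch: I'm assuming a ~3:1 compression ratio (in the range DSC targets) and ignoring blanking/protocol overhead.

```python
# Why DSC lets 8K 60Hz HDR fit through DP 1.4 (approximate figures).
DP14_GBPS = 25.92  # effective data rate of HBR3 x4 lanes

def gbps(w, h, fps, bits_per_pixel):
    return w * h * fps * bits_per_pixel / 1e9

uncompressed = gbps(7680, 4320, 60, 30)  # 10-bit-per-channel RGB
with_dsc = uncompressed / 3              # assuming a ~3:1 DSC ratio

print(f"uncompressed: {uncompressed:.1f} Gbit/s")  # far beyond DP 1.4
print(f"with DSC:     {with_dsc:.1f} Gbit/s")      # fits within 25.92
```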


2 hours ago, Damascus said:

Again, not really. Remember, it's 4×, not double, so if 8GB is the minimum for 4K, 32GB is the minimum for 8K.

However, if we look at current games: a game that utilises ~4GB of vRAM @1080p typically wouldn't use more than 6-7GB of vRAM @4K, which is less than double, so... vRAM doesn't really scale linearly with resolution either :D 

 

However, we could probably expect the vRAM requirements to double for 8K, if this TweakTown article is anything to go by...

http://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html
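That non-linear scaling makes sense if you think of VRAM as a fixed chunk for assets plus a resolution-dependent chunk for render targets. Here's a toy model; the asset size and bytes-per-pixel figures are made up purely for illustration:

```python
# Toy model: textures/geometry don't grow with resolution; only the
# framebuffer / render targets do. All numbers here are hypothetical.
TARGET_BYTES_PER_PIXEL = 100  # assumed total across all render targets

def vram_gb(width, height, asset_gb=3.5):
    buffers_gb = width * height * TARGET_BYTES_PER_PIXEL / 1024**3
    return asset_gb + buffers_gb

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"{name}: ~{vram_gb(w, h):.1f} GB")
```

With numbers like these, 4x the pixels needs nowhere near 4x the VRAM, which matches the observation above about current games.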

Looking at my signature are we now? Well too bad there's nothing here...

What? As I said, there seriously is nothing here :) 


Linus said this in an NCIX Tech Tips video: he ran Shadow of Mordor with 4K supersampling, effectively 8K, ran out of VRAM, and the FPS tanked well below 15FPS. If you have the VRAM you'd probably get a little more than 15FPS due to pixel engine optimisations and such, maybe 16FPS, but if you don't you'd be at like 4 or 5 FPS.

Yours faithfully

