
GAMING at 16K RESOLUTION?? - HOLY $H!T

Just now, komar said:

 

WHY P5000s??

Was it for the lols - 16K on 16 gigs of VRAM - or did big brother Nvidia just say f*ck you, this is what you get????

If it was just for the lolz I'm very disappointed - from the last video it was obvious that the 8 or 10 or whatever gigs the Xps have ain't enough for 8K, so how did you expect it to run games well with 16 gigs at 16K????

I think it was for their ability to output 4 x DisplayPort 1.4 (4 x 4K each) and for their ability to sync their outputs (a feature meant for monitor walls), so the image won't look jello-like (due to one card refreshing the image 1 ms faster than another card, for example).

Lower-end cards may only handle three 4K outputs at a time without tricks like daisy-chaining displays, and then you'd still have the sync issue.
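
For a rough back-of-the-envelope check of that output math, here's a minimal sketch (Python). It assumes the 4x4 wall of 4K panels from the video and four DisplayPort outputs per card, and ignores the sync problem entirely, so treat it as illustrative only:

```python
# Minimal sketch: how many cards does a 4x4 wall of 4K panels need,
# assuming each card can natively drive four DisplayPort outputs?
# (Illustrative only; sync between cards is a separate problem.)
import math

panels = 16            # 4x4 wall of 4K displays
outputs_per_card = 4   # DP 1.4 outputs per Quadro P5000

print("Cards needed for outputs:", math.ceil(panels / outputs_per_card))  # -> 4

# Combined resolution of the wall
width, height = 4 * 3840, 4 * 2160
print(f"Wall resolution: {width} x {height} "
      f"(~{width * height / 1e6:.0f} megapixels)")  # -> 15360 x 8640 (~133 MP)
```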

 


20 hours ago, Tequila628 said:

Thank you, man! I'm one of those who's been on a G9 since 2007, then on a G9x (wide grip, fingertip) to this day, while constantly looking for a replacement these last few years. I bought a G303 a year ago and just couldn't get used to fingertipping it. I beat DOOM on Ultra-Nightmare with the G9x, but I can clearly see how much worse its sensor is compared to the G303's.

 

20 hours ago, tecggst said:

 

I use the G403 and love it. It has the same sensor as the G903 and a lot of the same base technology and performance. The difference is mainly in build quality and aesthetics (like some fancy weight-saving features, additional swappable buttons, infinity scrolling, etc.).

 

If you can, I'd wait to see how the G703 ends up. It's basically the same mouse, with slightly better switches and Logitech's new wireless charging tech (that is, if you are willing to pay the premium for it).

I have 2 G403's at home and use one at work. If it fits your hand, it's hands down the best mouse. Love that thing.


PC: 13900K, 32GB Trident Z5, AORUS 7900 XTX, 2TB SN850X, 1TB MP600, Win 11

NAS: Xeon W-2195, 64GB ECC, 180TB Storage, 1660 Ti, TrueNAS Scale


44 minutes ago, WMGroomAK said:

Well, you guys obviously impressed the folks over at Ars Technica...

 

https://arstechnica.com/gaming/2017/08/what-does-it-take-to-power-gaming-at-16k-resolution/

 

So, whose office is this system being moved into???

The system is currently being used by @AlexTheGreatish as a solidworks rig. :P



Just now, jakkuh_t said:

 

I have 2 G403's at home and use one at work. If it fits your hand, it's hands down the best mouse. Love that thing.

 

I initially bought the G900 but ended up not loving its feel in the hand; the G403 fit my grip style much better.

 

 


2 minutes ago, jakkuh_t said:

The system is currently being used by @AlexTheGreatish as a solidworks rig. :P

 

Wait, someone is actually using the rig? I had thought it would be taken apart and the monitors returned to Acer and the GPUs to Nvidia

 

@AlexTheGreatish how do you like the multi-display setup for solidworks? I use a two-screen setup at work (I use solidworks for a lot of my projects; one screen for solidworks, the other for emails, drawings, analysis, Excel, etc.), and have often thought about requesting an upgrade for more screens, but I always worried the bezels would get in the way.


3 minutes ago, tecggst said:

 

Wait, someone is actually using the rig? I had thought it would be taken apart and the monitors returned to Acer and the GPUs to Nvidia

 

@AlexTheGreatish how do you like the multi-display setup for solidworks? I use a two-screen setup at work (I use solidworks for a lot of my projects; one screen for solidworks, the other for emails, drawings, analysis, Excel, etc.), and have often thought about requesting an upgrade for more screens, but I always worried the bezels would get in the way.

Oh I only took the computer bit for the 4 GPUs haha, using a single 240Hz panel to snipe the mouse gestures


1 minute ago, AlexTheGreatish said:

Oh I only took the computer bit for the 4 GPUs haha, using a single 240Hz panel to snipe the mouse gestures

 darn it!

 

Enjoy those Quadros; we use consumer-level cards at work and I can work with massive assemblies with thousands of parts no problem, so I can only imagine what sort of power those Quadros bring to the table.


10 minutes ago, tecggst said:

 darn it!

 

Enjoy those Quadros; we use consumer-level cards at work and I can work with massive assemblies with thousands of parts no problem, so I can only imagine what sort of power those Quadros bring to the table.

For work in SW it doesn't make a huge difference besides rotating models being a lot smoother.  Also it's pretty nice for recording what I'm making since I can dedicate a whole GPU to it lol


If he had used 8K monitors, could he possibly have done 32K? Obviously you'd need like 8 Quadro GPUs and some god-level CPU, but could he?


I want to see an HD 4890 running Half-Life 1 at 16K.

MAD-BOX Ryzen 1600X - ASRock X370 Killer SLI - Sapphire R9 Fury NITRO+  -Fried it... RIP

Xeon e5640 4.35ghz, CoolerMaster Seidon 240V, ASUS P6X58D-E, DDR3 8GB 1636mhz CL9, Sapphire Fury Nitro OC+, 2x Stone age storage @ 7200RPM, Crucial 960GB SSD, NZXT S340, Silverstone Strider Gold Evolution, Steelseries RIVAL, Mechanical Metal keyboard, Boogie Bug Aimb mouse pad.


Might be a dumb question, but if a 16K monitor were made, would this kind of power be required to game at 16K? Or is it so demanding because all the monitors are linked together?

 

CPU: Intel - Core i7-8700k
CPU Cooler: be quiet! - Dark Rock Pro 4 50.5 CFM CPU Cooler
Motherboard: MSI - MPG Z390 GAMING PRO CARBON
Memory: G.Skill - Trident Z RGB 16 GB (2 x 8 GB) DDR4-3200 Memory
Storage: Crucial P1 1tb M.2 NVME BOOT DRIVE
Video Card: MSI RTX 2070 Super
Case: NZXT - H500 (White) 
Power Supply: Corsair - RMx (2018) 550 W 80+ Gold 
 


I wonder if a Threadripper with its 64 PCIe lanes would perform better.


11 hours ago, komar said:

WHY P5000s??

In my experience getting P6000s for testing is super difficult. When we asked a GPU manufacturer to lend us a single P6000 we had to wait almost two months and had to return it within 10 days. Buying four of those is ~$24k US, and that is apparently the only way of procuring four of them at the same time. Getting them for free, even for a limited time, would be straight up impossible in my opinion.


7 hours ago, nicolas2465 said:

Might be a dumb question, but if a 16K monitor were made, would this kind of power be required to game at 16K? Or is it so demanding because all the monitors are linked together?

With 16 x 3840x2160 monitors, you'd have a 15360 x 8640 resolution. At 10 bits per color channel per pixel (required for HDR, Rec.2020, Blu-ray Ultra HD), you're then looking at

 

15360 x 8640 x 3 color channels x 10 bits per channel = 3,981,312,000 bits, or about 4 Gbit (~500 MB) per frame.

 

So at 60 fps you're looking at about 240 Gbit/s of raw data.

To send the data over a cable it gets 8b/10b encoded, meaning for every 8 bits of data, 10 bits go over the wire, so your 240 Gbit/s becomes 240 x 10 / 8 = 300 Gbit/s.

 

For comparison, HDMI 2.0 can do 18 Gbit/s, HDMI 2.1 can do 48 Gbit/s, and DisplayPort 1.4 tops out at 32.4 Gbit/s (about 26 Gbit/s of actual video data after 8b/10b). So you'd need around ten cables for your monitor with today's tech.

There is DSC (Display Stream Compression), real-time compression used for 8K and for 4K 120Hz and higher, which cuts the bandwidth roughly in half or better with extremely small quality loss.

 

With YCbCr 4:2:0 and 8-bit color you could probably do 16K 60fps at around 100 Gbit/s and therefore manage with four DisplayPort cables; that would be fine for regular Blu-ray movies, games, and some apps, but would suck for HDR and for Photoshop-like or video editing software.
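
If you want to play with that arithmetic yourself, here's a minimal sketch of the same numbers (Python). The resolution, bit depths, and frame rate are the ones above; the DP 1.4 figure is the effective video rate after 8b/10b; blanking and protocol overhead are ignored, so the results are rough estimates only:

```python
# Back-of-envelope 16K display bandwidth, matching the figures above.
# Blanking intervals and protocol overhead are ignored, so treat the
# results as rough estimates only.
import math

WIDTH, HEIGHT = 15360, 8640   # 16x 4K panels in a 4x4 grid
FPS = 60

def raw_gbps(bits_per_channel, channels_per_pixel=3.0):
    """Uncompressed video bandwidth in Gbit/s."""
    return WIDTH * HEIGHT * channels_per_pixel * bits_per_channel * FPS / 1e9

rgb10 = raw_gbps(10)                 # 10-bit RGB (HDR)
on_wire = rgb10 * 10 / 8             # 8b/10b line coding adds 25 % on the cable
print(f"10-bit RGB:  {rgb10:.0f} Gbit/s raw, {on_wire:.0f} Gbit/s on the wire")

ycbcr420 = raw_gbps(8, channels_per_pixel=1.5)   # 4:2:0 averages 1.5 samples/pixel
print(f"8-bit 4:2:0: {ycbcr420:.0f} Gbit/s raw")

DP14_EFFECTIVE = 25.92               # Gbit/s of video data per DP 1.4 (HBR3) link
print("DP 1.4 links for 10-bit RGB: ", math.ceil(rgb10 / DP14_EFFECTIVE))    # -> 10
print("DP 1.4 links for 8-bit 4:2:0:", math.ceil(ycbcr420 / DP14_EFFECTIVE)) # -> 4
```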


16 hours ago, mariushm said:

I think it was for their ability to output 4 x DisplayPort 1.4 (4 x 4K each) and for their ability to sync their outputs (a feature meant for monitor walls), so the image won't look jello-like (due to one card refreshing the image 1 ms faster than another card, for example).

Lower-end cards may only handle three 4K outputs at a time without tricks like daisy-chaining displays, and then you'd still have the sync issue.

 

You missed my point - I didn't ask why Quadros, I was asking why the older Quadros with 16 instead of 24 gigs of VRAM.


16 hours ago, Cool_Guy81 said:

I wonder if a Threadripper with its 64 PCIe lanes would perform better.

I'm somewhat doubtful of that: even with Threadripper having 64 PCIe lanes, 4 of them are dedicated to the chipset, so you would still not be able to get all the GPUs running at x16... At the same time, I don't think there are significant gains from running a GPU at x16 over x8, so the 6950X should still be sufficient. A better question might be running a high-end Xeon or Epyc; however, then you lose the clock performance.
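
A quick sketch of that lane budget (Python; the 40-lane count for the 6950X and the x4 chipset link on first-gen Threadripper are the commonly quoted platform figures, and lanes borrowed by NVMe, USB controllers, etc. are ignored):

```python
# Quick lane-budget check: can either platform feed four GPUs at x16?
# Lane counts are the commonly quoted CPU totals; lanes borrowed by
# NVMe drives, USB controllers, etc. are ignored here.

PLATFORMS = {
    "i7-6950X (Broadwell-E)": 40,            # CPU PCIe 3.0 lanes
    "Threadripper (64 - 4 for chipset)": 60,
}

GPUS = 4
LANES_PER_GPU = 16   # a full x16 link per card

for name, lanes in PLATFORMS.items():
    needed = GPUS * LANES_PER_GPU
    verdict = "fits" if lanes >= needed else f"short by {needed - lanes} lanes"
    print(f"{name}: {lanes} lanes vs {needed} needed for 4x x16 -> {verdict}")
```

Either way at least one card has to drop below x16, which fits the point above that x16 vs x8 rarely makes a big difference anyway.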


19 hours ago, komar said:

You missed my point - I didn't ask why Quadros, I was asking why the older Quadros with 16 instead of 24 gigs of VRAM.

 

Linus mentioned that that's why Nvidia was willing to lend them


You guys should post screenshots AND absolutely try Elite Dangerous with it. It has a special screenshot feature which doubles the resolution of the screen used, so you would end up with a screenshot of something like 30720 x 17280 ... !!


On 03/08/2017 at 6:26 PM, laking-gamer said:

If he had used 8K monitors, could he possibly have done 32K? Obviously you'd need like 8 Quadro GPUs and some god-level CPU, but could he?

According to Nvidia, yes. That would run near $120k at current off-the-shelf prices.


On 8/2/2017 at 3:27 PM, jakkuh_t said:

We wanted to "1 up" the 8K video in every way we could. I mean, four 8K displays would have been too easy.

yeah but it would have looked better


I am overall surprised that you can even game at 16K resolution.


I've watched this one a few times now. This is honestly one of the coolest things I've ever seen.

