New display cable makes 8K resolution @ 120Hz possible

BiG StroOnZ

Great. Now explain that to Nvidia and AMD; they seem to think it requires some fancy separate circuit board, costs over $100, and only works with a select few expensive cards.

[Image: fig07.jpg]

And that is why, when you have a multi-screen setup on the same card, the screens are not in sync either, which is a problem for multi-monitor gaming.

But I guess all Nvidia and AMD needed was a university engineering student.

They teach you exactly how to do this in any good Operating Systems class for low-level rapid context switching. Hell, Intel and AMD both have circuitry in their CPUs that handles such an algorithm at the hardware level for even better performance.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


They teach you exactly how to do this in any good Operating Systems class for low-level rapid context switching. Hell, Intel and AMD both have circuitry in their CPUs that handles such an algorithm at the hardware level for even better performance.

Yeah, I know, I did the freaking class, but sadly doing this causes performance drops. They use different techniques in real life so the hardware doesn't go "wait... let me twiddle my thumbs and do nothing for a while until my results come in and I can work with them". Stalls hurt performance, no matter what you do. You can't remove them all, but you can try to diminish them.

In this case, you want the GPU to render all six segments, HOLD until the last one is done, BUT have the GPU start on the following frame by prediction, or better yet, help another GPU finish the work (i.e. internal SLI, if you will), and then output the frame to the six lanes. The monitor needs to process the signal (six controllers for the six segments) to display it on the panel, so it needs to be in sync. The video cable needs sync lines that work with the GPU's video output controller so all the segments of a frame are displayed in sync. Alternatively, the monitor can have a multi-core controller with sync lanes to make the process simpler. But regardless, you want sync.
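
To make that HOLD step concrete, here's a minimal sketch (Python, purely illustrative; the six-thread layout and names like render_segment and scan_out are my stand-ins, not how a real display controller is built): each worker renders one segment, and nothing goes out on the lanes until all six have reached the barrier.

```python
# Illustrative only: six segment renderers held at a barrier, so a frame is
# scanned out to the lanes only once every one of its segments is finished.
import threading

NUM_SEGMENTS = 6
frame_done = threading.Barrier(NUM_SEGMENTS)   # releases only when all six arrive
segments = [None] * NUM_SEGMENTS               # stand-in per-segment buffers

def scan_out(segs, frame_number):
    # Stand-in for pushing the six finished segments onto the six lanes in lockstep.
    print(f"frame {frame_number}: outputting {len(segs)} segments in sync")

def render_segment(index, frame_number):
    # Stand-in for rasterizing one sixth of the frame.
    segments[index] = f"pixels of segment {index}, frame {frame_number}"
    frame_done.wait()                          # HOLD until the last segment is done
    if index == 0:                             # exactly one worker triggers the output
        scan_out(segments, frame_number)

workers = [threading.Thread(target=render_segment, args=(i, 0)) for i in range(NUM_SEGMENTS)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```

The stall I'm complaining about is exactly the time the faster workers spend parked at that wait().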

And again, if it were as simple as it's taught in class, why do we need these expensive boards? Why is the board not just some dumb chip on an otherwise blank PCB, and why does it need two Ethernet cables and a coaxial connection instead of one thin, simple wire? Why can't a GPU, our GeForce or Radeon, display games on multiple monitors in sync, despite them being on the same graphics card? If it's so simple, why has no one implemented it, why do you need complex hardware, why do you need expensive stuff, why is it only on select hardware, and why don't this cable and DisplayPort have it? Why do the affected monitors have no system to try to sync things?


Yeah, I know, I did the freaking class, but sadly doing this causes performance drops. They use different techniques in real life so the hardware doesn't go "wait... let me twiddle my thumbs and do nothing for a while until my results come in and I can work with them".

AMD and Nvidia can go suck dirt. Intel, as an investor I command thee: go fix this problem and make a killing doing it!

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


8K/120Hz is completely pointless right now, so why even make this cable?
For games we'll need at least 10x the power to make current games even playable at 8K.
And we don't even have any storage devices that can handle those massive files; not even the biggest HDDs would make it practical, with an 8K movie easily being 500GB if not more.
Maybe in 10-15 years 8K will be usable, but by then this cable will be extremely outdated.
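
Rough numbers, just to show the scale; the frame rate, runtime, and the compressed bitrate below are my own assumptions, not figures from the article:

```python
# Back-of-the-envelope 8K movie sizes. Frame rate, runtime, and the
# compressed bitrate are assumptions for illustration only.
WIDTH, HEIGHT = 7680, 4320       # 8K UHD
FPS = 24                         # assumed cinematic frame rate
BITS_PER_PIXEL = 24              # 8-bit RGB, uncompressed
RUNTIME_S = 2 * 60 * 60          # assumed two-hour movie

raw_bytes = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS * RUNTIME_S / 8
print(f"uncompressed: ~{raw_bytes / 1e12:.0f} TB")        # roughly 17 TB

ASSUMED_BITRATE = 200e6          # hypothetical 200 Mbit/s encode
print(f"at 200 Mbit/s: ~{ASSUMED_BITRATE * RUNTIME_S / 8 / 1e9:.0f} GB")   # roughly 180 GB
```

Depending on the bitrate and bit depth you assume, a few hundred GB per 8K movie is entirely plausible.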

 

RTX2070OC 


No point in 120Hz because it's 4:2:0. If it can do 8K 60Hz with 4:4:4, then it's awesome.

 

Since 4:2:0 eats half the bandwidth that 4:4:4 does... you guessed it! This cable does 4:4:4 at 8K @ 60Hz.

...oh, and it's still 36-bit color depth for both.

 

I just calculated: this cable can do 5120x2160, 4:4:4, full 10-bit color (same as 30-bit) @ 200Hz, and you still have about 9% of headroom left on the cable ^^
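
For anyone who wants to check the arithmetic, here's the payload-only version; it ignores blanking intervals and line-coding overhead, so the exact headroom shifts a little depending on what you count:

```python
# Payload-only bandwidth check: active pixels x refresh rate x bits per pixel.
# Blanking and line-coding overhead are ignored, so real cable budgets and the
# exact headroom figure will differ somewhat.

def payload_gbps(width, height, hz, bits_per_channel, chroma="4:4:4"):
    # 4:2:0 keeps full-resolution luma but quarter-resolution chroma,
    # so it averages 1.5 "channels" per pixel instead of 3.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * bits_per_channel * channels / 1e9

# The cable's two claimed modes work out to the same payload rate (~71.7 Gbit/s):
budget = payload_gbps(7680, 4320, 120, 12, "4:2:0")
print(f"8K120 4:2:0 36-bit: ~{budget:.1f} Gbit/s, "
      f"8K60 4:4:4 36-bit: ~{payload_gbps(7680, 4320, 60, 12, '4:4:4'):.1f} Gbit/s")

# The 5120x2160 @ 200Hz, 10 bits per channel (30-bit), 4:4:4 case:
ultrawide = payload_gbps(5120, 2160, 200, 10, "4:4:4")   # ~66.4 Gbit/s
print(f"ultrawide needs ~{ultrawide:.1f} Gbit/s, "
      f"headroom ~{100 * (1 - ultrawide / budget):.0f}%")
```

With these assumptions it comes out at around 7% headroom; count the overheads differently and you land in the same ballpark as the ~9% above.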


AMD and Nvidia can go suck dirt. Intel, as an investor I command thee: go fix this problem and make a killing doing it!

Yea exactly.

Multi-stream is eventually coming, but we might change to optical before that. And no one wants a Pentium D. Once again, it doesn't change the fact that this has no sync: it has no wires for any sync ability, so it can't just be magically added. And it doesn't change the fact that if you use a 5K monitor today, you'll get tearing on anything that moves, hence everyone is avoiding the early 4K monitors like the plague, as they use multi-stream DisplayPort. At work we play a lot with high-resolution monitors; we have 4K laser projectors, the latest GeForce and Quadro cards (we use a lot of Nvidia libraries and CUDA), and a 4K MST monitor... it's bad. It's not even funny. When you move a window you can see the tearing.


8K/120Hz is completely pointless right now, so why even make this cable?

For games we'll need at least 10x the power to make current games even playable at 8K.

And we don't even have any storage devices that can handle those massive files; not even the biggest HDDs would make it practical, with an 8K movie easily being 500GB if not more.

Maybe in 10-15 years 8K will be usable, but by then this cable will be extremely outdated.

 

 

HDMI started in 2002, when it was first created, and it didn't become a standard until 2007, so it took five years to be adopted. We still use it to this day in 2015, nearly 13 years later. This cable will not be outdated in 10-15 years, just revised. And I hate to break it to you, but in five years alone there will be cards capable of delivering an 8K experience. GM200 alone will make 4K plausible with a single card and perfect with two, and it could be released any time within the next couple of months (Titan II). Now add four more years of progression, and you don't think 8K will be possible? Look back at graphics cards four years ago and see what they were capable of, then look at the massive leap we took in that span of time.


Yea exactly.

Multi-stream is eventually coming, but we might change to optical before that. And no one wants a Pentium D. Once again, it doesn't change the fact that this has no sync: it has no wires for any sync ability, so it can't just be magically added. And it doesn't change the fact that if you use a 5K monitor today, you'll get tearing on anything that moves, hence everyone is avoiding the early 4K monitors like the plague, as they use multi-stream DisplayPort. At work we play a lot with high-resolution monitors; we have 4K laser projectors, the latest GeForce and Quadro cards (we use a lot of Nvidia libraries and CUDA), and a 4K MST monitor... it's bad. It's not even funny. When you move a window you can see the tearing.

The wire doesn't need it. You could achieve this with a tiny ARM core in the monitor.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No point in 120Hz because it's 4:2:0. If it can do 8K 60Hz with 4:4:4, then it's awesome.

 

Correct me if I'm wrong, but I think DisplayPort 1.3 can do that...

2017 Macbook Pro 15 inch


Damn.


CPU: Intel i5 4570 | Cooler: Cooler Master TPC 812 | Motherboard: ASUS H87M-PRO | RAM: G.Skill 16GB (4x4GB) @ 1600MHZ | Storage: OCZ ARC 100 480GB, WD Caviar Black 2TB, Caviar Blue 1TB | GPU: Gigabyte GTX 970 | ODD: ASUS BC-12D2HT BR Reader | PSU: Cooler Master V650 | Display: LG IPS234 | Keyboard: Logitech G710+ | Mouse: Logitech G602 | Audio: Logitech Z506 & Audio Technica M50X | My machine: https://nz.pcpartpicker.com/b/JoJ


Can humans see faster than 60Hz or not?

 

Your refresh rate should be as high as possible.

 

If people want to share their "opinions" on how there is no difference at xxx FPS or xxx Hz, they need to get educated... http://jgp.rupress.o...9.full.pdf html

 

The human eye is capable of detecting a single photon of light. The nerves that tell your brain the eye has detected some light require about 9 photons per 1ms; at higher photon densities (virtually every condition aside from a light-tight room with a photon emitter), the time needed to activate the nerves shortens relative to the photons received.

 

So even in a light condition so low as to provide only 9,000 photons per second (to put this sensitivity into perspective, outdoor light provides about 400,000,000,000 photons per second), we can perceive a difference at 1000 fps, but it's not enough light to get a "full response" from the nerves.

 

Your black/white detectors (low light) are capable of a full transition in sensitivity at roughly 300Hz (you can think of this as the full-off to full-on time, about 3ms).

 

Your colour detectors (bright light only) are capable of a full transition in sensitivity at roughly 80Hz (about 12.5ms from full off to full on).

 

This is why you cannot see colour when it's dark. (Note: you have a concentration of colour receptors in the centre of your eye, which is why a star looks dimmer when you look at it directly than when you look away and catch it from the corner of your eye.)

 

Keep in mind that smaller changes in photon density are more readily perceived; it's not a 12.5ms delay to see colour and a 3ms delay to see brightness.

 

Now... this is where it gets funky. The mind's perception of vision is subjective, and what you "see" is not actually what your eyes see; it is what your consciousness produces based on the information provided by your eyes. So what is actually happening at 60Hz or 144Hz or any other refresh rate on your screen is that your brain is essentially streaming you a continuous, blurry image of what is happening.

 

The same thing happens when you see rain: it's not individual droplets you see but long blurry streaks... this compensates for the activation time of the rods and cones in your eyes and combines colour data and brightness data (separate receptor types are used) into one view of the world...

 

So the higher the refresh rate, the more "accurate" your imagined view of the screen is compared to what is actually being generated by the game.

 

This is why it's harder to catch a ball under 50Hz fluoro lights than outside in daylight: your brain is less accurately imagining where the ball is in space.

Sim Rig:  Valve Index - Acer XV273KP - 5950x - GTX 2080ti - B550 Master - 32 GB ddr4 @ 3800c14 - DG-85 - HX1200 - 360mm AIO


Long Live VR. Pancake gaming is dead.


Because you can so see the difference between 4:2:0 and 4:4:4 on an average monitor outputting an average movie file.

 

 

Welcome to LTT, where people can tell the difference between 120Hz and 144Hz, so of course someone is gonna complain about this doing 8K @ 120Hz @ 4:2:0.

 

And I'm being a bit of a dick about this, but what is the consumer impact of 4:2:0 versus 4:2:2 versus 4:4:4? What consumer even knows? This is stuff that creators worry about and consider, the guys working on the next big rendering project. Those kinds of people, who strive for utmost accuracy, will give a damn.

 

 

Many may not, but an "enthusiast" would, and who do you think wants 8K 120Hz? Yep... enthusiasts.

 

I tested this with my 4K 60Hz screen, toggled from 4:4:4 to 4:2:0, and the difference is intense.

 

See the image below for the loss in fidelity when using sub-4:4:4 chroma subsampling.

 

 

 

[Image: Colorcomp.jpg]

 

 

This is a still. Try this on a video and tell me you see any difference.

 

 

 

If you cannot notice the difference, there is either something wrong with your eyes or your monitor.

▶ Learn from yesterday, live for today, hope for tomorrow. The important thing is not to stop questioning. - Einstein◀

Please remember to mark a thread as solved if your issue has been fixed; it helps others who may stumble across the thread at a later point in time.


The wire doesn't need it. You could achieve this with a tiny ARM core in the monitor.

How does it know which frame's segment goes to which frame? The current frame or the new frame?

Also, ARM processors aren't free.


As amazing as that is, there is zero content available at that resolution and refresh rate. Not a useless product though, because it'll be useful for 4K stuff, but 8K 120Hz content...? Give it a few years, and a few more years after that for 120Hz content...

COMIC SANS


How does it know which frame's segment goes to which frame? The current frame or the new frame?

Also, ARM processors aren't free.

No, but a single A57 core is about $10 when bought en masse.

 

That detail is architected early on. Each of the lines transports a particular fraction of the frame, and you just split those lines at the monitor into different RAM modules. Then the chip starts at stick 1 and sends the data to the panel pixel by pixel.
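
Something like this toy sketch, in other words (not real scaler firmware; the names and the explicit frame-number tag are just for illustration): each lane fills its own buffer, and the controller scans the buffers out in lane order once every lane has delivered the same frame.

```python
# Toy sketch of a monitor-side controller: one buffer per lane, scanned out in
# lane order once every lane has delivered a segment of the same frame.
from dataclasses import dataclass

NUM_LANES = 6

@dataclass
class Segment:
    frame_number: int
    pixels: bytes          # the fraction of the frame this lane always carries

lane_buffers = [None] * NUM_LANES   # stand-in for the per-lane RAM modules

def receive(lane, segment):
    # The physical lane index alone says where the data belongs in the frame.
    lane_buffers[lane] = segment
    frames = {s.frame_number for s in lane_buffers if s is not None}
    if all(lane_buffers) and len(frames) == 1:
        scan_out(frames.pop())

def scan_out(frame_number):
    # "Start at stick 1": walk the buffers in lane order and push pixels to the panel.
    for lane in range(NUM_LANES):
        push_to_panel(lane_buffers[lane].pixels)
        lane_buffers[lane] = None               # ready for the next frame's segment
    print(f"displayed frame {frame_number}")

def push_to_panel(pixels):
    pass  # placeholder for the actual timing-controller interface

# Example: feed one full frame's worth of segments.
for lane in range(NUM_LANES):
    receive(lane, Segment(frame_number=0, pixels=b"\x00" * 16))
```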

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


No, but a single A57 core is about $10 when bought en masse.

 

That detail is architected early on. Each of the lines transports a particular fraction of the frame, and you just split those lines at the monitor into different RAM modules. Then the chip starts at stick 1 and sends the data to the panel pixel by pixel.

And why is it a problem with MST monitors, which have only two lanes?


And why is it a problem with MST monitors, which have only two lanes?

Poor implementation most likely, and a 32-bit core lacking superscalar architecture.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Poor implementation most likely, and a 32-bit core lacking superscalar architecture.

And why do you think that is the case?

My vote is on cost. Everything you describe adds cost. G-Sync is $200... people already have trouble paying that premium.

Heck, a $700-800 monitor is hard enough a sell; imagine another $200 on top. Say R&D were free and it were easy to implement, and we're looking at $50 more (prices always get rounded up, so your $30, say, becomes $50 or $100)... on top of that $700-800 monitor. Not going to do well. And it defeats the purpose of MHL, which is supposed to be low-powered.

So now phones need a display controller with a 64-bit processor (which I see no relation to; it could be 32-bit or even 16-bit, you are just syncing things), and superscalar at that? That sounds like a power-hungry chip in the phone space. A vector processor could be used instead. Or better yet, the GPU could handle it, and that would solve a heck of a lot of problems, but sadly even our fancy desktop R9 290 or GTX 980 or TITAN doesn't support any sort of video sync.


Can humans see faster than 60Hz or not?

Short answer: Yes

Longer answer: Our eyes do not have a "frame rate", not in the sense that computers and displays use. If you were in a pitch-black environment and an extremely bright light flashed into your eyes for a 200th of a second, it's likely you would notice it (and your eyes would hurt). Of course, that is an ideal situation; a lack of light would be far more difficult to detect.

The cones and rods in our eyes are very easy and quick to "excite", but they take a while to "calm down" from the excited state they get into when hit by light. This is why we see dots after looking at a bright light like a camera flash.

That was the simple answer according to my eye doctor. There is probably more to it than that.


I have an amazing idea... how about it's not there not because everyone is an idiot, but because of cost.

too bad the cost is essentially nothing.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


Short answer: Yes

Longer answer: Our eyes do not have a "frame rate", not in the sense that computers and displays use. If you were in a pitch-black environment and an extremely bright light flashed into your eyes for a 200th of a second, it's likely you would notice it (and your eyes would hurt). Of course, that is an ideal situation; a lack of light would be far more difficult to detect.

The cones and rods in our eyes are very easy and quick to "excite", but they take a while to "calm down" from the excited state they get into when hit by light. This is why we see dots after looking at a bright light like a camera flash.

That was the simple answer according to my eye doctor. There is probably more to it than that.

OMG if one more person answers this rhetorical question...

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


too bad the cost is essentially nothing.

The reality is not that, sadly. Hence why no one is implementing it.

But if you think you can do it for cheap, dude, you can make some serious money.

Your idea seems cheap, and I can't see why you couldn't make it at home, I mean as a prototype.


If you cannot notice the difference, there is either something wrong with your eyes or your monitor.

So you want us to compare 4:4:4 and 4:2:0 on YouTube, which only supports 4:2:0? Are you serious?


this topic:

 

new cable

...

..

.

..

...

RAAAGE

Sim Rig:  Valve Index - Acer XV273KP - 5950x - GTX 2080ti - B550 Master - 32 GB ddr4 @ 3800c14 - DG-85 - HX1200 - 360mm AIO


Long Live VR. Pancake gaming is dead.

