3D 120Hz 1440p & 4K monitors - WHEN??? Any news?

Guys, I have been looking for 1440p 3D monitors since the start of this year. 1440p is becoming the standard resolution for gamers, and the next step is 4K, which is getting popular but is still unaffordable. So, since we already have 3D at 1080p (a resolution that feels like last century), why isn't 3D improving and moving on to the newer, higher-resolution monitors? It's true that you need roughly 7.4 million pixels per frame (2560x1440 rendered twice, once per eye), but if Nvidia and AMD are designing GPUs for 4K now, there MUST be enough power to feed a 3D 1440p monitor. Right now, 3D gaming is keeping a lot of gamers stuck on 1080p.
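
To put rough numbers on that (a quick back-of-the-envelope sketch in Python; it only counts pixels, and real GPU load obviously depends on much more than resolution):

# Stereoscopic 3D renders every frame twice, once per eye, so the pixel
# count per displayed frame doubles. Pure pixel counting, nothing else.
def stereo_pixels_per_frame(width, height):
    return width * height * 2  # two eye views per displayed frame

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {stereo_pixels_per_frame(w, h) / 1e6:.1f} million pixels per stereo frame")
# 1080p: 4.1M, 1440p: 7.4M, 4K: 16.6M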

 

So WHEN can I expect 1440p and 4K monitors where 3D and G-Sync work together? And when is Nvidia going to improve its technology and make a GPU that outperforms 3x Kepler or 4x R9 290X, so every 3D game runs at 70+ FPS at both 1440p and 4K?

 

One more thing: I see lots of TVs advertised at higher refresh rates than monitors (which top out at 60/120/144 Hz), and most of them are 3D. So why not make a monitor with 200Hz, 500Hz or even 1000Hz (I saw a 4K TV advertised at that)?

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


Guys, I have been looking for 1440p 3D monitors since the start of this year. 1440p is becoming the standard resolution for gamers, and the next step is 4K, which is getting popular but is still unaffordable. So, since we already have 3D at 1080p (a resolution that feels like last century), why isn't 3D improving and moving on to the newer, higher-resolution monitors? It's true that you need roughly 7.4 million pixels per frame (2560x1440 rendered twice, once per eye), but if Nvidia and AMD are designing GPUs for 4K now, there MUST be enough power to feed a 3D 1440p monitor. Right now, 3D gaming is keeping a lot of gamers stuck on 1080p.

 

So WHEN can I expect 1440p and 4K monitors where 3D and G-Sync work together? And when is Nvidia going to improve its technology and make a GPU that outperforms 3x Kepler or 4x R9 290X, so every 3D game runs at 70+ FPS at both 1440p and 4K?

 

One more thing: I see lots of TVs advertised at higher refresh rates than monitors (which top out at 60/120/144 Hz), and most of them are 3D. So why not make a monitor with 200Hz, 500Hz or even 1000Hz (I saw a 4K TV advertised at that)?

What's the use of 1000Hz? 500Hz? They aren't very practical, considering no one would really notice a difference.

Diamond 5 in League :)


We're not psychics; how should we know?

"Probably Because I'm A Dangerous Sociopath With A Long History Of Violence"
 


One more thing: I see lots of TVs advertised at higher refresh rates than monitors (which top out at 60/120/144 Hz), and most of them are 3D. So why not make a monitor with 200Hz, 500Hz or even 1000Hz (I saw a 4K TV advertised at that)?

Just for clarification, a 240Hz TV isn't really 240Hz. It's a 60Hz panel with motion interpolation generating extra in-between frames to make it look like 240Hz. Also, because money.
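
Just to illustrate the idea of interpolation (a toy sketch; real TVs use motion estimation rather than a simple cross-fade like this):

# Toy "240Hz from 60Hz": synthesize 3 extra frames between each pair of real
# frames by blending them. Real TVs do motion-compensated interpolation.
def interpolate(frame_a, frame_b, steps=3):
    for i in range(1, steps + 1):
        t = i / (steps + 1)
        yield [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

frame_a, frame_b = [0, 0, 0], [40, 40, 40]      # two consecutive real frames
output = [frame_a, *interpolate(frame_a, frame_b), frame_b]
print(len(output), "displayed frames from 2 real frames")  # 5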

.


There are 120 Hertz 1440p monitors. 

I am talking about 120 Hz 3D 1440p monitors. 

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


Guys, I have been looking for 1440p 3D monitors since the start of this year. 1440p is becoming the standard resolution for gamers, and the next step is 4K, which is getting popular but is still unaffordable. So, since we already have 3D at 1080p (a resolution that feels like last century), why isn't 3D improving and moving on to the newer, higher-resolution monitors? It's true that you need roughly 7.4 million pixels per frame (2560x1440 rendered twice, once per eye), but if Nvidia and AMD are designing GPUs for 4K now, there MUST be enough power to feed a 3D 1440p monitor. Right now, 3D gaming is keeping a lot of gamers stuck on 1080p.

 

So WHEN can I expect 1440p and 4K monitors where 3D and G-Sync work together? And when is Nvidia going to improve its technology and make a GPU that outperforms 3x Kepler or 4x R9 290X, so every 3D game runs at 70+ FPS at both 1440p and 4K?

 

One more thing: I see lots of TVs advertised at higher refresh rates than monitors (which top out at 60/120/144 Hz), and most of them are 3D. So why not make a monitor with 200Hz, 500Hz or even 1000Hz (I saw a 4K TV advertised at that)?

 

You just need a 120Hz monitor for 3D, so just wait a bit.

Also, if you can see a difference above 120Hz, you're just not human.


I am talking about 120 Hz 3D 1440p monitors. 

A lot of 120Hz+ monitors are NVIDIA 3D Vision certified, so that requirement is covered. Almost any high-end niche monitor will have the certification, so it's not much to worry about. I think even the ROG SWIFT is, but don't quote me on it. As for AMD users, oh well.

Main Rig: CPU: AMD Ryzen 7 5800X | RAM: 32GB (2x16GB) KLEVV CRAS XR RGB DDR4-3600 | Motherboard: Gigabyte B550I AORUS PRO AX | Storage: 512GB SKHynix PC401, 1TB Samsung 970 EVO Plus, 2x Micron 1100 256GB SATA SSDs | GPU: EVGA RTX 3080 FTW3 Ultra 10GB | Cooling: ThermalTake Floe 280mm w/ be quiet! Pure Wings 3 | Case: Sliger SM580 (Black) | PSU: Lian Li SP 850W

 

Server: CPU: AMD Ryzen 3 3100 | RAM: 32GB (2x16GB) Crucial DDR4 Pro | Motherboard: ASUS PRIME B550-PLUS AC-HES | Storage: 128GB Samsung PM961, 4TB Seagate IronWolf | GPU: AMD FirePro WX 3100 | Cooling: EK-AIO Elite 360 D-RGB | Case: Corsair 5000D Airflow (White) | PSU: Seasonic Focus GM-850

 

Miscellaneous: Dell Optiplex 7060 Micro (i5-8500T/16GB/512GB), Lenovo ThinkCentre M715q Tiny (R5 2400GE/16GB/256GB), Dell Optiplex 7040 SFF (i5-6400/8GB/128GB)


OMG, AMD... 720p HD3D? Really? Disappointing.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


Do a lot of people actually use the 3D gaming feature?
I thought it was more of a failed attempt to boost sales than anything else.

I found it gimmicky. I think there will be more investment in VR than 3D going forward.

 

Probably 2 or 3 years until you see improvements in that range.

In a year, 4K displays will be more affordable and you'll see a bigger push from AMD/Nvidia to make something that can power them.

 

TVs are bought by a much larger audience than high-end computer monitors. Bigger numbers are seen as better and thus garner more sales. That's why you see the really high advertised refresh rates.

Plasma TVs are the only TVs I know of that go over 480Hz, and that's because of the technology behind them.

The TVs are 3D because manufacturers hope it will help them sell. Nothing more.

 

You can get 240Hz computer monitors.

But, correct me if I'm wrong: in order to make use of XXX Hz, you have to be able to hit that frame rate with the video card, no?

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


Do a lot of people actually use the 3D gaming feature?

I thought it was more of a failed attempt to boost sales than anything else.

I found it gimmicky. I think there will be more investment in VR than 3D going forward.

 

Probably 2 or 3 years until you see improvements in that range.

In a year, 4K displays will be more affordable and you'll see a bigger push from AMD/Nvidia to make something that can power them.

 

But, correct me if I'm wrong: in order to make use of XXX Hz, you have to be able to hit that frame rate with the video card, no?

 

Well, not many people really use it, but for gaming 3D is actually perfect. As you said, VR will take over, but VR comes with 3D support, so basically 3D will continue on VR (Oculus Rift) rather than on monitors. On TVs, 3D isn't used much; many people can't watch 3D comfortably, as it can hurt. So on TVs, yes, they add it just to get more sales.

 

Also, you are right: to make use of those Hz you need to hit that frame rate, and video card performance is still lacking a lot.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


All 120 Hz monitors are 3D, if I'm correct.

 

Sorry, you are not quite right. If a monitor is 3D ready, then it can do 3D Vision and it must be 120Hz, but not every 120Hz monitor is 3D ready. Monitors that are 144Hz can also be 3D.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


I find it gives me a headache. I have a couple of friends who find the same thing.

 

There are a few reasons why they're not putting out a card with 4x the power.

One, they can make more money by slowly releasing upgrades over time. That only matters if they could actually deliver that kind of performance jump now, which I'm pretty sure they can't.

Two, as dies shrink, manufacturing becomes more difficult, so either prices will increase or it will take longer to produce a stable product.

Three, if they simply pushed more power through a video card to boost performance, you'd run into problems with heat, power bills, etc. No one wants to pay to run a 1000-watt graphics card.

 

Things take research and development time. It's not like they're holding out on some 10x more powerful consumer grade graphics card.

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


lemme just pull out my crystal ball right quick and tell the future -.-

 

Please don't ask questions like this; nobody knows the answer.

 

 

 

Not many people play in 3D, and many games implement 3D poorly since not a lot of people use it.

 

I've heard it gives some people headaches, and that certain games just really don't play nicely with 3D.

@dizmo gets headaches from it, for example.

Stuff:  i7 7700k @ (dat nibba succ) | ASRock Z170M OC Formula | G.Skill TridentZ 3600 c16 | EKWB 1080 @ 2100 mhz  |  Acer X34 Predator | R4 | EVGA 1000 P2 | 1080mm Radiator Custom Loop | HD800 + Audio-GD NFB-11 | 850 Evo 1TB | 840 Pro 256GB | 3TB WD Blue | 2TB Barracuda

Hwbot: http://hwbot.org/user/lays/ 

FireStrike 980 ti @ 1800 Mhz http://hwbot.org/submission/3183338 http://www.3dmark.com/3dm/11574089


Sorry, you are not quite right. If a monitor is 3D ready, then it can do 3D Vision and it must be 120Hz, but not every 120Hz monitor is 3D ready. Monitors that are 144Hz can also be 3D.

There is a pretty easy way to hack 120Hz monitors to make them Nvidia 3D certified, since no extra hardware is needed to turn a 120Hz monitor into a 3D-ready one. It's all on the software side.

 

Also, for your first question: the 1440p 120Hz monitors on the market are the Korean panels, which can probably be hacked to become 3D certified (I don't see why not).

 

ASUS is coming out with a 1440p 120Hz G-SYNC TN monitor soon (probably this quarter or Q2). I'm 99% sure those will be 3D certified out of the box.

 

For 4K 120Hz, we first have to wait for HDMI 2.0 and DisplayPort 1.3, both of which should come out this year. I wouldn't be surprised to see 4K TN 120Hz panels in Q4 2014 or Q1 2015.
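
A rough uncompressed-bandwidth estimate shows why the new links matter (assuming 24-bit colour and ignoring blanking and link-encoding overhead; 18 Gbit/s and 32.4 Gbit/s are the nominal HDMI 2.0 and DP 1.3 link rates):

# Raw video data rate = width * height * refresh rate * bits per pixel.
def raw_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

for name, (w, h, hz) in {"1440p @ 120Hz": (2560, 1440, 120),
                         "4K @ 60Hz":     (3840, 2160, 60),
                         "4K @ 120Hz":    (3840, 2160, 120)}.items():
    print(f"{name}: ~{raw_gbps(w, h, hz):.1f} Gbit/s raw")
# ~10.6, ~11.9 and ~23.9 Gbit/s: 4K 120Hz blows past HDMI 2.0's 18 Gbit/s
# but fits within DisplayPort 1.3's 32.4 Gbit/s.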

 

IPS is a different story. Most likely, with the release of HDMI 2.0 and DP 1.3, we will see 4K 60Hz IPS monitors, but I'm sure the price on those will be insane. Heck, we don't even have 1080p 120Hz IPS monitors today.

 

I wouldn't even be surprised if OLED catches up in price by 2015/16.


I find it gives me a headache. I have a couple of friends who find the same thing.

 

There are a few reasons why they're not putting out a card with 4x the power.

One, they can make more money by slowly releasing upgrades over time. That only matters if they could actually deliver that kind of performance jump now, which I'm pretty sure they can't.

Two, as dies shrink, manufacturing becomes more difficult, so either prices will increase or it will take longer to produce a stable product.

Three, if they simply pushed more power through a video card to boost performance, you'd run into problems with heat, power bills, etc. No one wants to pay to run a 1000-watt graphics card.

 

Things take research and development time. It's not like they're holding out on some 10x more powerful consumer grade graphics card.

 

Yeah, headaches are normal on 3D screens, but when VR arrives with 3D, there will be no headaches and everyone will enjoy 3D. Watch Linus's latest CES 2014 video: he says he has no problem watching 3D in VR, even at 1080p. I have never gotten a headache watching 3D movies in IMAX cinemas or playing 3D games with my friends on their PCs and consoles.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


Yeah, headaches are normal on 3D screens, but when VR arrives with 3D, there will be no headaches and everyone will enjoy 3D. Watch Linus's latest CES 2014 video: he says he has no problem watching 3D in VR, even at 1080p. I have never gotten a headache watching 3D movies in IMAX cinemas or playing 3D games with my friends on their PCs and consoles.

 

The reason your head hurts while watching a 3D movie with polarized glasses (IIRC that's the term) is that a bit of the image leaks into the wrong eye, which confuses your brain.

Nvidia's solution (3D Vision, with active shutter glasses), however, shouldn't give you a headache.


 

The reason your head hurts while watching a 3D movie with polarized glasses (IIRC that's the term) is that a bit of the image leaks into the wrong eye, which confuses your brain.

Nvidia's solution (3D Vision, with active shutter glasses), however, shouldn't give you a headache.

 

OK, but one thing: 3D will not work together with G-Sync right now, but do I really need it to work with G-Sync? What happens if I play a 3D Vision game at around 30-50 FPS with G-Sync (no stutter or lag) instead of 60 FPS? Do I really need 60 FPS to get the best out of 3D Vision, considering that IMAX 3D cinemas only run movies at 48 FPS and normal cinemas at 24 FPS? Why is every gamer really targeting 60 FPS? With V-Sync at 60 FPS I could stop both stutter and tearing, but then the only problem is the performance required to achieve that on a 1440p 3D monitor, knowing that the GPU has to render each frame twice.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


OK, but one thing: 3D will not work together with G-Sync right now, but do I really need it to work with G-Sync? What happens if I play a 3D Vision game at around 30-50 FPS with G-Sync (no stutter or lag) instead of 60 FPS? Do I really need 60 FPS to get the best out of 3D Vision, considering that IMAX 3D cinemas only run movies at 48 FPS and normal cinemas at 24 FPS? Why is every gamer really targeting 60 FPS? With V-Sync at 60 FPS I could stop both stutter and tearing, but then the only problem is the performance required to achieve that on a 1440p 3D monitor, knowing that the GPU has to render each frame twice.

 

The problem is that with stereoscopic 3D you need the display to run at 120Hz for the 3D to work, because each eye effectively only gets 60 FPS. If you add G-Sync, it will lower that refresh rate and break the experience.
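
To make the arithmetic explicit (a tiny sketch; it assumes active-shutter 3D, where the left and right eye views alternate every refresh):

# With active shutter 3D, left/right views alternate refreshes, so each eye
# only sees every other frame the panel displays.
def per_eye_fps(display_hz):
    return display_hz / 2

for hz in (120, 100, 60):
    print(f"{hz}Hz panel -> {per_eye_fps(hz):.0f} frames/s per eye")
# 120Hz gives each eye 60; if G-Sync drags the panel down to 60Hz, each eye gets 30.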


The problem is that with stereoscopic 3D you need the display to run at 120Hz for the 3D to work, because each eye effectively only gets 60 FPS. If you add G-Sync, it will lower that refresh rate and break the experience.

OK, now... 120Hz means 120 FPS... How could I possibly get that much on a 1440p screen??? :/ Oh gosh, don't tell me that's 4-way SLI... That's a really big, overkill performance requirement. OK, let's hope Maxwell's jump in performance is as big as the jump in efficiency shown on Nvidia's timeline graph.

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


What's the use of 1000Hz? 500Hz? They aren't very practical, considering no one would really notice a difference.

 

It makes strobe-free CRT motion clarity possible, i.e. CRT clarity without the CRT flicker.

1ms of persistence translates to 1 pixel of motion blur during 1000 pixels/second motion (e.g. panning/strafing/turning).

Frames are static, while eyes are continuously moving. As you track moving objects on a screen, your eyes are in a different position at the beginning of a visible refresh than at the end of it. That creates motion blur as the static frame gets smeared across your retinas. The shorter the persistence, the less motion blur, since persistence-based motion blur behaves like a camera shutter (60Hz sample-and-hold = 1/60sec of motion blurring = like panning a camera at a 1/60sec shutter speed). Sports photographs taken at 1/250sec, 1/500sec and 1/1000sec shutter speeds are noticeably different if the motion is fast enough, and the same applies to displays -- 120Hz is not the final frontier, and not even 500Hz is. Good educational animations include www.testufo.com/eyetracking and www.testufo.com/blackframes for those who want to understand persistence better.
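
Here is that relationship as a few lines of Python (illustrative only; "blur" is simply how far the static frame smears across the retina while the eye keeps moving):

# Eye-tracking motion blur ~= tracking speed * how long each frame stays lit.
def blur_px(speed_px_per_sec, persistence_ms):
    return speed_px_per_sec * persistence_ms / 1000.0

speed = 1000  # pixels/second of panning, as in the example above
for label, persistence_ms in [("60Hz sample-and-hold", 1000 / 60),
                              ("120Hz sample-and-hold", 1000 / 120),
                              ("1ms strobe or 1000Hz", 1.0)]:
    print(f"{label}: ~{blur_px(speed, persistence_ms):.1f} px of smear at {speed} px/s")
# ~16.7 px, ~8.3 px and ~1.0 px respectively.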

 

You can either flash each frame for 1ms (120Hz, each refresh lit for 1ms with black gaps in between): CRT phosphor, strobe backlights, light modulation.

Or you can have 1ms frames with no gaps between them: completely strobe-free and flicker-free. That automatically requires 1000fps@1000Hz (either real frames or interpolated ones).

Both methods would have the same amount of motion blur.

 

It makes it easier to pass a theoretical holodeck Turing test ("Wow, I didn't know I was standing in a holodeck"), because finite frame rates have side effects (stroboscopic effects, the mouse-dropping effect, the wagon-wheel effect, motion blur, the flicker-versus-motion-blur tradeoff, etc.), and going to true 1000fps@1000Hz would pretty much make low persistence possible without needing strobing.

 

Currently, all low-persistence displays require light modulation (phosphor decay, strobe backlights, etc.), because right now there's no way to get 1ms persistence without light modulation -- unless you fill all the 1ms timeslots, aka 1000fps@1000Hz.

 

Some useful reading:

- Michael Abrash of Valve Software: Down the VR Rabbit Hole (he comments on 1000fps)

- Why We Need 1000fps @ 1000Hz This Century

- Understanding Persistence: Strobed & non-strobed, CRT vs LCD

- Educational Animations: www.testufo.com/eyetracking and www.testufo.com/blackframes

 

For example, popular strobe backlights (e.g. LightBoost, ULMB, BenQ Blur Reduction, Turbo240, etc.) flash the backlight for as little as 1-2ms, once per refresh cycle. The only way to match that low amount of motion blur without strobing is to fill all the timeslots: 2ms persistence would require 500fps@500Hz to be completely flicker-free/strobe-free, and 1ms persistence would require 1000fps@1000Hz.
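
The same equivalence in code form (trivial, but it makes the numbers concrete: with no strobing, each frame stays lit for the whole refresh period, so the required refresh rate is just the reciprocal of the target persistence):

# Strobe-free persistence (ms) = 1000 / refresh rate (Hz), so
# required refresh rate (Hz) = 1000 / target persistence (ms).
def strobe_free_hz(persistence_ms):
    return 1000.0 / persistence_ms

for target in (2.0, 1.0):
    hz = strobe_free_hz(target)
    print(f"{target}ms persistence without strobing needs {hz:.0f}fps @ {hz:.0f}Hz")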

 

Conclusion: Stop spreading the myth that there is no benefit beyond 120Hz.

People like Oculus, John Carmack, Michael Abrash, Valve Software, and myself at Blur Busters all unanimously agree that persistence is important, and we all recognize that the engineering challenge of 100% strobe-free low persistence unavoidably requires ultra-high frame rates. That's why all current low-persistence displays are light-modulated in some way (phosphor, flicker, black frames, strobing, etc.).

 

NOTE: GtG (the pixel transition time) is different from persistence (how long each frame remains visible).


  • 3 weeks later...

 

 

1000fps@1000Hz

 

Well, I guess that won't be achievable for the next 3 years... 

 

I have been waiting since 2010 to see GPUs that can deliver the performance needed for stereoscopic 3D, and interestingly enough I still need 2 cards to get around 60-80 FPS... disappointing. And second, why did AMD rebrand their cards instead of moving to a new die and adding more performance? Did they expect Mantle to do all the FPS boosting, which now means lowering graphics quality (no one wants that) to increase FPS? Why not improve their HD3D, which is only 720p, and take it beyond Nvidia? Really, with DX12 coming now, AMD, you are screwed with that Mantle... your cards are only good for coin mining...

 

As for the Oculus Rift: 1080p is understandable since it's new, but hurry up and get it to game developers already!!! PC games will not port well to VR, and neither will console games, so make ALL virtual reality games exclusive. Good luck this time, consoles! =) Oh, poor PS4, soon enough it will be headed for the trash too... :D :D :D

 

Oh well... looks like these are my upgrade options now:

- Get a 1080p 3D monitor now and pick up the consumer Oculus Rift too when it arrives?

- Wait for a 1440p 3D gaming monitor and get it plus the consumer Oculus Rift?

- Just wait for the Oculus Rift and stay on my 900p 60Hz monitor?

Intel Core i9-9900K | Asrock Phantom Gaming miniITX Z390 | 32GB GSkill Trident Z DDR4@3600MHz C17 | EVGA RTX 3090 FTW3 Watercooled | Samsung 970 EVO 1TB M.2 SSD | Crucial MX500 2TB SSD | Seasonic Focus Plus Gold 1000W | anidees AI Crystal Cube White V2 | Corsair M95 | Corsair K50 | Beyerdynamic DT770 Pros 250Ohm


We are really in the middle of a display technology revolution right now: higher refresh rates, variable refresh rate displays (G-Sync), virtual reality, much higher PPD displays. This comes on the back of LightBoost, 3D, 144Hz and ever smaller screens. It has all gone nuts in the last couple of years, with so many promising products on the horizon, and most of us would like all of the tech in one display for $50. It's not an easy time to buy: each of these products is individually innovative, but ultimately you are forced to choose between them or buy all of them, and they are all kind of expensive. I am expecting all the big stuff (G-Sync, 4K, Oculus Rift) to drop this year at various points and various price points. All of them are going to be better or worse in certain scenarios (the Rift is no use in a game not designed for it, for example), so it's all going to be a trade-off when the first iterations drop.

 

Right now I think 144Hz monitors probably aren't worth buying, because G-Sync monitors will do strobing better and offer G-Sync as well. Equally, 1440p monitors are soon to be replaced by 4K ones in a similar state, but then later this year we will see 4K G-Sync monitors and maybe IPS-based G-Sync monitors. I mean, arrghhhh, what the hell do you buy, and when, with the market like this? Too many good choices.

