
WAN show rant about streaming video game data live

This is 100% an idea I had back in the late 1980s.  And it's still a great idea.  @LinusTech is just missing the vision thing.

 

Note that I don't know if this is what they're planning, but here's my idea:


Realtime Online Virtual Environments (I even named it: ROVE).  It would be the World Wide Web equivalent for gaming and VR data.  No need to download anything to visit any virtual environment.  You jump in and the ROVE protocol serves you visual data in priority order, adjusting it as you move around.  It would cache as much as it can, and it would use cache faulting to figure out what needs to be resent.
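
I don't have the old notes in front of me, but the core loop is simple enough to sketch.  Here's a toy Python version of "serve in priority order, skip what's cached, resend on a cache fault"; every name in it (RoveServer, integer asset ids, distance as the priority) is just an assumption for illustration, not the actual design.

```python
import heapq

class RoveServer:
    def __init__(self, assets):
        # assets: {asset_id: (position, payload)} -- hypothetical world data
        self.assets = assets

    def stream_order(self, viewer_pos, client_cache):
        """Yield the asset ids the client is missing, nearest-first."""
        heap = []
        for asset_id, (pos, _payload) in self.assets.items():
            if asset_id in client_cache:
                continue  # already cached on the client; don't resend
            dist = sum((a - b) ** 2 for a, b in zip(pos, viewer_pos))
            heapq.heappush(heap, (dist, asset_id))
        while heap:
            _dist, asset_id = heapq.heappop(heap)
            yield asset_id

    def handle_cache_fault(self, asset_id):
        """Client thought it had this cached but doesn't; send it again."""
        return self.assets[asset_id][1]
```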


Dan is killing it, hitting on ideas I had years and years ago about why this is a good thing.  In-game advertising is one.  Never needing to update the game is another.

 

But another big thing is it allows for MASSIVE environments.  Massive environments that can be connected to each other across servers, across different "games" or different "universes".  And I think the data needs would be far less than Linus imagines.  I designed this when 10Mbit ethernet was the fastest most people had, and you could easily stream a 1988 3-D space on that old ethernet.  Especially with carefully designed systems that serve distant objects as very simple polygons and textures and only give you detailed textures for things that are very close.  Nothing new in that, but it would save a ton of network bandwidth.
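
To make the bandwidth point concrete, here's a rough sketch of that distance-based level-of-detail idea.  The tier names and distance thresholds are invented for this post, not numbers from any real engine.

```python
def detail_tier(distance_to_viewer):
    """Pick how much geometry/texture data to send for one object."""
    if distance_to_viewer < 20.0:     # very close: full mesh, detailed textures
        return "full"
    if distance_to_viewer < 100.0:    # mid range: simplified mesh, low-res textures
        return "reduced"
    return "impostor"                 # far away: a handful of polygons, flat shading

def representation_to_send(obj, distance_to_viewer):
    # obj.representations is assumed to hold pre-built data per tier
    return obj.representations[detail_tier(distance_to_viewer)]
```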

 

But THE major use case that none of them considered, is that the environments could be COMPLETELY user-changeable at any time and in any way.  Do you own a virtual plot of land with a mountain?  Blow it up.  Or put a tunnel through it.  Or make an entire tunnel system.  Or replace it with a giant building.  Or a massive hole in the ground.

 

This actually led to the major technical sticking point I had in the design — when I was considering this in my head over thirty years ago — which was that these arbitrary changes could suddenly join two previously well-separated regions, in terms of visibility.  To efficiently deliver polygons and textures you have to know what is visible from any spot anywhere in the VR space.  That's an easy technical problem.  Binary space partitioning (BSP).  Solved problem.  But if you take two areas with lots of polygons that were totally invisible to each other because of a wall, and you put a hole in that wall, then you need to nearly instantly recalculate a massive part of your BSP tree.  At that time, calculating all that visibility was very expensive.  It's STILL pretty expensive, but I expect it would be doable now.  There are also more efficient visibility and culling algorithms these days for doing those calculations.
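
For anyone who hasn't run into BSP, a toy version of the idea looks something like this.  It's nowhere near engine-grade (it doesn't even split straddling polygons); the point is just that visibility lives in a precomputed tree, and an edit like punching a hole in a wall forces the affected part to be rebuilt, which is the expensive step I'm talking about.

```python
class BspNode:
    def __init__(self, polygons, front=None, back=None):
        self.polygons = polygons  # polygons lying on this node's splitting plane
        self.front = front
        self.back = back

def build_bsp(polygons, classify):
    """Recursively partition polygons.  `classify(pivot, poly)` returns
    'front' or 'back' for poly relative to pivot's plane."""
    if not polygons:
        return None
    pivot, rest = polygons[0], polygons[1:]
    front = [p for p in rest if classify(pivot, p) == "front"]
    back = [p for p in rest if classify(pivot, p) == "back"]
    return BspNode([pivot], build_bsp(front, classify), build_bsp(back, classify))

def rebuild_after_edit(region_polygons, classify):
    """What happens when the player blows a hole in a wall: the whole
    affected region's tree gets rebuilt from scratch."""
    return build_bsp(region_polygons, classify)
```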

 

The other observation about that technical problem is that, for many VR uses, nobody would care.  Many good ideas have talked themselves out of existence by focusing on a problem that ultimately doesn't matter.  Tim Berners-Lee's great idea wasn't hypertext systems.  Those existed all over, and were discussed in academic papers.  But there was a problem: how do you efficiently handle dead/bad links?  His genius idea was to not bother solving it at all, because nobody would care.  So if you're in a VR space and knock down a wall, and everything on the other side of the wall is blank briefly while it recalculates, so what?  You could even make a big cloud of dust whenever there's a change like that, which would explain the lack of visibility.
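
If you wanted to code that dodge up, it might look roughly like this.  The scene/region objects and recompute_visibility are placeholders I'm inventing to show the shape of it, nothing more.

```python
import threading

def on_wall_destroyed(region, scene):
    scene.spawn_effect("dust_cloud", region.center)  # hides the temporarily blank area
    region.visibility_ready = False                  # renderer draws nothing behind the gap yet

    def recalc():
        # the slow part: rebuilding visibility/BSP data for the newly joined regions
        region.visibility = recompute_visibility(region)  # hypothetical helper
        region.visibility_ready = True                    # dust can clear now

    threading.Thread(target=recalc, daemon=True).start()
```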

 

 

Again, I don't know if this particular game is going to do any of this.  But this is why you would want to predominantly stream polygons and textures instead of delivering them ahead of time.  It's also an absolutely essential technology to create in order to achieve a real virtual reality universe of the kind many people have envisioned.  And it is probably something where an open standard would make all of our lives better.  I still have notes on the project somewhere.  (Note that designing a standard protocol for transferring polygons, textures, audio, and user inputs, plus caching and cache faulting, is in my opinion not a thing that should be hard.  But... I've seen how another human being, supposedly intelligent, implemented the IMAP protocol, so yeah, it can't be something that sucks like that.)
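
Just to show how small the surface area of such a protocol could be, here's a back-of-the-envelope message layout.  Every type and field name below is mine, made up for this post; it's a sketch of the shape, not a spec.

```python
from dataclasses import dataclass
from enum import Enum, auto

class MsgType(Enum):
    POLYGONS = auto()     # server -> client: geometry for a region, sent in priority order
    TEXTURE = auto()      # server -> client: texture data at some detail level
    AUDIO = auto()        # server -> client: positional audio chunk
    USER_INPUT = auto()   # client -> server: movement and actions
    CACHE_FAULT = auto()  # client -> server: "I need asset X and don't have it cached"
    INVALIDATE = auto()   # server -> client: "asset X changed, drop your cached copy"

@dataclass
class RoveMessage:
    msg_type: MsgType
    asset_id: int
    payload: bytes = b""

# e.g. the client noticing a missing texture and asking for it:
fault = RoveMessage(MsgType.CACHE_FAULT, asset_id=1234)
```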
