
Nvidia Raytraces the Moon Landing

TacticalSquid
1 hour ago, mr moose said:

From what I understand, Big Hero 6 took months to render on huge render farms, and it only had 127 traces per frame (happy to be corrected on that given I didn't look it up and am operating on memory).

I was curious and found: https://www.fxguide.com/fxfeatured/disneys-new-production-renderer-hyperion-yes-disney/


1 hour ago, Mira Yurizaki said:

55,000 cores and 400 TB of memory spread across a cloud-based render farm. That's a far cry from 4 Quadros in a box.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


14 hours ago, mr moose said:

 

Again, that is working with completely controlled environments and pre-rendered artwork. They didn't create Toy Story from a grainy, 50-year-old, barely understandable video. From what I understand, Big Hero 6 took months to render on huge render farms, and it only had 127 traces per frame (happy to be corrected on that given I didn't look it up and am operating on memory). This is an interactive recreation from that first video (meaning it recreates all the traces as you move the mouse).


ILM did a real-time Star Wars ride with 4 Quadros rendering the lighting for a projector (the kind that sits in a spaceship-type ride).

https://blogs.nvidia.com/blog/2018/03/28/walt-disney-imagineering-nvidia-star-wars-galaxys-edge-millennium-falcon/

 

"ILMxLAB and running in real time"

Also with Nvidia and Epic Games. It was 8 GPUs... though it might have skipped the ray tracing if it was using Unreal Engine (unless it used some early RTX features... but probably not, it seems).

 

Big Hero 6 also had way more fidelity than anything "real time" in a game. Look it up on YouTube. People were doing "real-time" ray-tracing previews in Blender and other engines for production/development, but it was really low-res/choppy at times or required stupid rig setups.

 

For gaming, it was pointless (see the early ray-traced Quake water etc., which ran at 15 FPS for no real benefit; Intel wanted to push one of its architectures and needed a tech demo).

 

If you think RTX is comparable to Big Hero 6, then you are totally delusional!

"Look, this Elephant is also a mammal, so the same as a mouse!!!"


9 hours ago, TechyBen said:

ILM did a real-time Star Wars ride with 4 Quadros rendering the lighting for a projector (the kind that sits in a spaceship-type ride).

https://blogs.nvidia.com/blog/2018/03/28/walt-disney-imagineering-nvidia-star-wars-galaxys-edge-millennium-falcon/

 

"ILMxLAB and running in real time"

Also with Nvidia and Epic Games. It was 8 GPUs... though it might have skipped the ray tracing if it was using Unreal Engine (unless it used some early RTX features... but probably not, it seems).

 

Big Hero 6 also had way more fidelity than anything "real time" in a game. Look it up on YouTube. People were doing "real-time" ray-tracing previews in Blender and other engines for production/development, but it was really low-res/choppy at times or required stupid rig setups.

 

For gaming, it was pointless (see the early ray-traced Quake water etc., which ran at 15 FPS for no real benefit; Intel wanted to push one of its architectures and needed a tech demo).

 

If you think RTX is comparable to Big Hero 6, then you are totally delusional!

"Look, this Elephant is also a mammal, so the same as a mouse!!!"

I think you have completely missed the point I was making.

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


11 hours ago, mr moose said:

I think you have completely missed the point I was making.

 

Yep. I still do not know what point you were making. Mine was that the current RTX implementation (low-res raycasting of some elements, for some types of effects/features) has been done before, but mainly only on over-the-top hardware or in content creation. RTX brings it to gamers usably for the first time, but ray tracing existed before (see the ray-traced Quake version back in... [checks] "Quake Wars: Ray Traced", ~2009).

 

Big Hero 6, and by extension some of the processes of Toy Story (1), were entirely different in implementation, scope and quality.

 

What was your point? I'm happy to hear it, but I don't know what it was. :(


17 minutes ago, TechyBen said:

Yep. I still do not know what point you were making. Mine was that the current RTX implementation (low-res raycasting of some elements, for some types of effects/features) has been done before, but mainly only on over-the-top hardware or in content creation. RTX brings it to gamers usably for the first time, but ray tracing existed before (see the ray-traced Quake version back in... [checks] "Quake Wars: Ray Traced", ~2009).

 

Big Hero 6, and by extension some of the processes of Toy Story (1), were entirely different in implementation, scope and quality.

 

What was your point? I'm happy to hear it, but I don't know what it was. :(

This is what I am saying:

 

1. In response to it not looking that crash-hot:

On 7/21/2019 at 8:20 AM, mr moose said:

You are thinking of it like a computer game, where the scenery and physics are all controlled and engineered by the computer/artist. Their primary goal here is to be as accurate as possible to the physical conditions, not to make them look pretty by controlling light angles, materials, etc.

 

If you don't like the way a window looks in a game, you can change its refractive index and it will look better in the game. They are not going to change the refractive index of space suits and valves etc. just to make them look prettier; they are trying to recreate, from the source material, as realistic a representation as possible of what it actually looked like. They are trying to be as true to the original as possible with very limited input data.


It's Nvidia; if they wanted, they could bend the light, add some colors and shit, and make the moon dust look amazing, but that's not their goal.

 

2. In response to the idea that they are doing something really easy with existing hardware from years ago (remember, the ability to ray trace even a simple interactive scene has only become a thing for the average end user with RTX):

On 7/21/2019 at 9:10 AM, mr moose said:


>company invests millions of dollars using top-end hardware/talent and puts their reputation on the line producing something.

>Internet random claims the output could have been done by a monkey in a basement.

 

If the end result of 5 years' work by people way more professional than us, using hardware we can only dream about and with industry support from NASA and Nvidia, is somehow lackluster to our eyes, then maybe we are not considering half of what they have done and the limitations they might have faced.

 

3. Pointing out the quality of the source material they were working with:

On 7/21/2019 at 11:15 AM, mr moose said:

didn't answer the question.

 

EDIT: Don't forget this is the quality of their source material:

[embedded video]

That and a few higher-res photos. It's not like they have access to a Maya data bank of this stuff that they can just manipulate; they have to extract all the data from this.


There was literally no point in doing this any earlier, because up until now hardly anyone has had the hardware available to be able to use the interactive material. Yes, a company like ILM or Animal Logic could have recreated the videos, but that's all they'd be: videos. Until RTX, real-time ray tracing on consumer equipment basically did not exist.


EDIT: And someone will likely try to mark this post as funny because they think the hardware did exist before RTX, except that it didn't. The ability to ray trace did; the ability to apply huge amounts of path tracing to pre-rendered videos using massive amounts of hardware did. But being able to carry out the ray tracing on the end user's device, as they control the angle of the camera, has not been a thing domestically. In fact, many are arguing that even with RTX it is still not a thing, due to performance.

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


Quote

2. In response to the idea that they are doing something really easy with existing hardware from years ago (remember, the ability to ray trace even a simple interactive scene has only become a thing for the average end user with RTX):

This. This is the point I'm contending. We were ray tracing entire scenes (not the RTX implementation of only some elements) in the '90s. We were real-time ray tracing at least by 2010, at ~30 to 60 fps (again, not RTX's some-elements approach, but all of them). We did not use it for gaming, though, as it offered nothing a game engine could not already "fake" at 120 fps or more on the same hardware.

 

I can link videos of people doing real-time ray tracing in Blender/Maya etc. in 2010. No one really wanted to try doing it for games, as it'd take a few GPUs or more to do. :P

 

RTX is just lower-quality ray tracing (drop the raycasts to low res, only raycast some elements, blur/smooth the results). But the silicon transistor count now means we are getting closer to doing it at "home" rather than on a render farm... though I doubt we will get to the render-farm equivalent, as the numbers (say, 4K at 60 fps) are just astronomical if you raycast every sub-pixel. You have to combine rays for effects, so an individual ray per pixel usually is not enough; you need to oversample at times (though some implementations use 1 ray but multiple interactions).
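To put rough numbers on that (a back-of-envelope sketch only; the sample and bounce counts below are assumptions for illustration, not measurements from any real renderer):

# Back-of-envelope ray budget for 4K at 60 fps (illustrative numbers only).
width, height = 3840, 2160      # 4K pixel grid
fps = 60
samples_per_pixel = 4           # modest oversampling; film renderers use far more
secondary_rays = 3              # assumed bounces per sample (shadow/reflection/GI)

primary = width * height * fps * samples_per_pixel
total = primary * (1 + secondary_rays)
print(f"{primary / 1e9:.1f} billion primary rays/s")
print(f"~{total / 1e9:.1f} billion rays/s including bounces")
# -> 2.0 billion primary rays/s, ~8.0 billion rays/s including bounces

Even with those modest assumptions you're at billions of rays per second, which is why the consumer implementations cut resolution and elements instead.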

 

So IMO it "existed", just not high res. Just not with the additional API features, and needed much more hardware. So yeah, not "consumer", but there were a lot of consumer tools. Price points have declined, not tech getting better (You can run it on a 1080ti, it runs, you just need more hardware to actually get good results).


49 minutes ago, TechyBen said:

So IMO it "existed", just not high res. Just not with the additional API features, and needed much more hardware. So yeah, not "consumer", but there were a lot of consumer tools. Price points have declined, not tech getting better (You can run it on a 1080ti, it runs, you just need more hardware to actually get good results).

Are you intentionally ignoring this bit:

56 minutes ago, mr moose said:

(remember, the ability to ray trace even a simple interactive scene has only become a thing for the average end user with RTX).

3. Pointing out the quality of the source material they were working with:


There was literally no point in doing this any earlier, because up until now hardly anyone has had the hardware available to be able to use the interactive material. Yes, a company like ILM or Animal Logic could have recreated the videos, but that's all they'd be: videos. Until RTX, real-time ray tracing on consumer equipment basically did not exist.


EDIT: And someone will likely try to mark this post as funny because they think the hardware did exist before RTX, except that it didn't. The ability to ray trace did; the ability to apply huge amounts of path tracing to pre-rendered videos using massive amounts of hardware did. But being able to carry out the ray tracing on the end user's device, as they control the angle of the camera, has not been a thing domestically. In fact, many are arguing that even with RTX it is still not a thing, due to performance.

 

EDIT: I'm not even claiming RTX is the bee's knees or anything, just that we haven't had anything in the domestic market capable of real-time ray tracing until now. Pointing at render farms and extremely expensive workstations that could do half of it is not the same as the average Joe being able to do it now on their own PC with just one GPU installed (no matter how average the quality).

 

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


2 hours ago, mr moose said:

Are you intentionally ignoring this bit:

EDIT: I'm not even claiming RTX is the bee's knees or anything, just that we haven't had anything in the domestic market capable of real-time ray tracing until now. Pointing at render farms and extremely expensive workstations that could do half of it is not the same as the average Joe being able to do it now on their own PC with just one GPU installed (no matter how average the quality).

 

From 5 years ago:

[embedded video]

RTX is between those two for resolution (of the rays). Much faster, though. It's just more hardware, more transistors in the same space, and bringing the price point down a bit.

 

7 years ago, the tech was demonstrated:

Viewport ray tracing, basically scaled to the power of what GPUs could output back in 2016:

[embedded video]

Finally, 9 years ago (I could not find the one I had, but this one will do), and on Quadros:

[embedded video]

Nvidia just scaled it with the price/cost and performance/power/heat refinements as the years have gone by. We've now hit the ability to do it cheaply(ish).

 

PS: looking it up, the RTX Titan used to be able to push about 1 fps or so to the old render viewports for ray tracing. So we've hit 60x the performance over the years. Over about 5 or 6 years, that seems about right for GPUs. I'm just saying it's nothing special; it's just a consequence of performance increases.


Looks like for raw performance the GTX Titan vs. the RTX 2080 is about 2x-3x on general compute. RTX has dedicated silicon for the raycasting, so no doubt they get a massive performance gain from that. The 1080 Ti does what, 25% or less of the performance with RTX on? So yeah, those things did exist for users previously... they were called a Titan or a 1080 Ti. ;)

 

But it's incremental improvements, and they have hit 30 or 60 fps (depending on resolution). :P


(Sorry, I was probably way off with my "60x performance gain" comment, though.)


9 hours ago, TechyBen said:

snip

Nvidia just scaled it with the price/cost and performance/power/heat refinements as the years have gone by. We've now hit the ability to do it cheaply(ish).

 

PS: looking it up, the RTX Titan used to be able to push about 1 fps or so to the old render viewports for ray tracing. So we've hit 60x the performance over the years. Over about 5 or 6 years, that seems about right for GPUs. I'm just saying it's nothing special; it's just a consequence of performance increases.

 

Your videos and claims are the same as what I am saying: you are talking about being able to do very limited work (in some cases at 1 fps) on extremely expensive hardware. That's hardly a suitable PR stunt if only a handful of people can play with it. Today there are literally millions of RTX devices out there that can do this now (many of them for US$400).

 

Again, there was no point in them doing this earlier, because the hardware was just not available.

 

No point in trying to showcase what a product can do when no one has that product.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


11 hours ago, mr moose said:

 

Your videos and claims are the same as what I am saying: you are talking about being able to do very limited work (in some cases at 1 fps) on extremely expensive hardware. That's hardly a suitable PR stunt if only a handful of people can play with it. Today there are literally millions of RTX devices out there that can do this now (many of them for US$400).

 

Again, there was no point in them doing this earlier, because the hardware was just not available.

 

No point in trying to showcase what a product can do when no one has that product.

Nvidia did, in one of those videos years ago (the one with water and glass). But the price point/progression was not quick enough for them to release it back then. I'm just saying it's nothing special; it's just natural progression (like getting to 64 cores from AMD/Intel now: amazing, but at this point plain old slow progression).

 

PS: I've only got UK prices here, but: RTX 2060 card price: £310.

RTX 2060 FPS with ray tracing in BF5: 30 fps average.

2080 Ti with backported ray tracing: ~30 fps average.

 

The 2080 Ti has gone from £1,200 to £800 now (though stock is gone now, I think?).

 

So yes, the price point crashed. So did 4K TVs, smartphones, wireless headphones, etc. Time goes by and things get cheaper. But the tech is nothing special.

 

IMO, it's also still not affordable, given RTX at the performance level of an RTX 2060 and the poor quality of the raycasting (low res, very few features, etc.).


43 minutes ago, TechyBen said:

Nvidia did, in one of those videos years ago (the one with water and glass). But the price point/progression was not quick enough for them to release it back then. I'm just saying it's nothing special; it's just natural progression (like getting to 64 cores from AMD/Intel now: amazing, but at this point plain old slow progression).

 

PS: I've only got UK prices here, but: RTX 2060 card price: £310.

RTX 2060 FPS with ray tracing in BF5: 30 fps average.

2080 Ti with backported ray tracing: ~30 fps average.

 

The 2080 Ti has gone from £1,200 to £800 now (though stock is gone now, I think?).

 

So yes, the price point crashed. So did 4K TVs, smartphones, wireless headphones, etc. Time goes by and things get cheaper. But the tech is nothing special.

 

IMO, it's also still not affordable, given RTX at the performance level of an RTX 2060 and the poor quality of the raycasting (low res, very few features, etc.).

 

So the question is: what hardware would consumers have used this on even 2 years ago?

 

5 years' work that was not possible to do affordably on a desktop until just recently is nothing special?


Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


That video was so ugly-looking... it seems like something a single person spent a week on, max.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


2 hours ago, mr moose said:

 

So the question is: what hardware would consumers have used this on even 2 years ago?

 

5 years' work that was not possible to do affordably on a desktop until just recently is nothing special?


Yep. ;)

 

AMD can do it too (there are those tech demos going around). It's progression. Like, yes, the 4-core and 8-core CPUs were great; they allowed things we could not do on 2-core CPUs... but the "ray tracing at 60 fps" is still a fudge, IMO. It's not 1080p ray tracing (it's a quarter of the resolution, scaled up), and it's often not 60 fps (it's temporally smoothed/limited to parts of the scene).
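For what it's worth, the general shape of that trick looks something like this toy sketch (my own illustration, not Nvidia's actual pipeline; trace is a hypothetical per-pixel shading function):

import numpy as np

def rt_frame_quarter_res(trace, w, h, prev_frame, alpha=0.2):
    """Toy sketch: ray trace half the pixels per axis (a quarter of the
    total), upscale, then blend with the previous frame to hide noise."""
    # Trace only every other pixel in x and y (a quarter of the work).
    low = np.array([[trace((2 * x + 0.5) / w, (2 * y + 0.5) / h)
                     for x in range(w // 2)] for y in range(h // 2)])
    # Nearest-neighbour upscale back to full resolution.
    full = low.repeat(2, axis=0).repeat(2, axis=1)
    # Temporal accumulation: exponential blending smooths ray noise across
    # frames (the "often not really 60 fps of fresh rays" part).
    return alpha * full + (1.0 - alpha) * prev_frame

# Hypothetical usage: frame = rt_frame_quarter_res(my_trace, 1920, 1080, frame)

The point of the sketch is just that a quarter of the rays plus reuse of old frames can look like "1080p60 ray tracing" while casting far fewer fresh rays.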

 

Show me full-scene rendering at 60 FPS, and I'd be amazed.

[embedded video]

(I'm not using that video as a comparison to raster renderers, just the RTX part for FPS/sample sizes. See the FPS crash at just a few ray samples in the scene.)

 

Full-scene ray tracing is close to what those demos I showed give. It's close to the Titan/1080 Ti performances... it's close to incremental improvement, not something special.

People were running dual and quad GPUs a decade ago, right? Is it still amazing that our current GPUs give that 4x performance for half the price of one of those cards? Or just plain progression?

 

It's amazing we are now hitting the point where consumer tech can hit those numbers/performances... but I still don't see it as special. It allows something like RTX, but it also allows anything else that's compute-intensive.

 

This is PhysX all over again: hype for a relatively mundane feature. (A few months before/after PhysX, people were getting comparable results with CPU-only physics or on competing brands/APIs... PhysX was just an extra compute chip, a chip/power you could get from anything/anywhere. Similarly with RTX: people are paying more for it... they are paying more for the compute; it's just bundled in now.)


To say something isn't impressive today because it was done years ago, without really thinking about what was going on then, I feel kind of misses the point of what makes such things a breakthrough. Otherwise let's just say everything in modern hardware isn't impressive; we've had some form of it years ago! Earlier I was mentioning to a friend that it's amazing you can now get a <$100 graphics card with the same computing power as a supercomputer from almost two decades ago. Okay, but it's really not amazing, because we had something with that much compute power at consumer pricing 10-some-odd years ago.

 

NVIDIA trying to make the Apollo ray-tracing tech demo sound amazing can be argued to be laughable, but that doesn't take away from what it can still do for games today. Every NVIDIA (and by extension ATI/AMD) tech demo was presented as "amazing" to some degree, because that's the point of a tech demo: to show off what the hardware is capable of. And not just "it's only capable of doing this specific thing in this specific context", but to show off its capabilities in general. While you'd like to think tech demos are for appeasing the PCMR crowd, they're just as much about telling developers who work on graphics what that hardware is capable of doing. Or at the very least, what sort of things you can do, with the demo as proof that you can do them.

 

Remember, the customers of hardware developers aren't necessarily the end users. If anything, I'd argue the primary customers are the people who develop the software for that hardware.

 

In any case, it's unlikely we'll have fully ray-traced scenes any time soon because of the extreme compute complexity. Even Disney's animation studios and Pixar don't use fully ray-traced rendering; it's a combination of raster rendering and ray tracing. If you're wondering why not just stick with raster rendering, it's because getting raster rendering to do something of similar quality to ray tracing takes a lot of hacks and kludges. From what I've read, ray tracing is a stupidly simple algorithm; the only things that make it complex are building the data set you need to feed it for efficient operation and the sheer number of things it needs to do.
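The "stupidly simple" core really is just a few lines. Here's a minimal single-sphere sketch (purely illustrative, nothing like a production renderer; the scene, camera, and light values are all made up):

import math

def intersect_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic 'a' term is 1: direction is normalized
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

# One sphere, one light: for every "pixel", cast a ray and shade the hit.
WIDTH, HEIGHT = 60, 24
center, radius = (0.0, 0.0, 3.0), 1.0
light_dir = normalize((-1.0, 1.0, -1.0))  # surface-to-light direction

for py in range(HEIGHT):
    row = ""
    for px in range(WIDTH):
        # Pinhole camera at the origin; 0.8 roughly corrects character aspect.
        ray = normalize((px / WIDTH - 0.5, (0.5 - py / HEIGHT) * 0.8, 1.0))
        t = intersect_sphere((0.0, 0.0, 0.0), ray, center, radius)
        if t is None:
            row += " "
        else:
            hit = tuple(t * r for r in ray)
            normal = tuple((h - c) / radius for h, c in zip(hit, center))
            lam = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
            row += ".:-=+*#%@"[min(int(lam * 8), 8)]
    print(row)

Run it and you get an ASCII-shaded sphere; everything a real renderer adds (acceleration structures, materials, bounces, denoising) is layered on top of this same loop, which is where the complexity and the compute cost come from.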


1 hour ago, Mira Yurizaki said:

To say something isn't impressive today because it was done years ago, without really thinking about what was going on then, I feel kind of misses the point of what makes such things a breakthrough. Otherwise let's just say everything in modern hardware isn't impressive; we've had some form of it years ago! Earlier I was mentioning to a friend that it's amazing you can now get a <$100 graphics card with the same computing power as a supercomputer from almost two decades ago. Okay, but it's really not amazing, because we had something with that much compute power at consumer pricing 10-some-odd years ago.

 

NVIDIA trying to make the Apollo ray-tracing tech demo sound amazing can be argued to be laughable, but that doesn't take away from what it can still do for games today. Every NVIDIA (and by extension ATI/AMD) tech demo was presented as "amazing" to some degree, because that's the point of a tech demo: to show off what the hardware is capable of. And not just "it's only capable of doing this specific thing in this specific context", but to show off its capabilities in general. While you'd like to think tech demos are for appeasing the PCMR crowd, they're just as much about telling developers who work on graphics what that hardware is capable of doing. Or at the very least, what sort of things you can do, with the demo as proof that you can do them.

 

Remember, the customers of hardware developers aren't necessarily the end users. If anything, I'd argue the primary customers are the people who develop the software for that hardware.

 

In any case, it's unlikely we'll have fully ray-traced scenes any time soon because of the extreme compute complexity. Even Disney's animation studios and Pixar don't use fully ray-traced rendering; it's a combination of raster rendering and ray tracing. If you're wondering why not just stick with raster rendering, it's because getting raster rendering to do something of similar quality to ray tracing takes a lot of hacks and kludges. From what I've read, ray tracing is a stupidly simple algorithm; the only things that make it complex are building the data set you need to feed it for efficient operation and the sheer number of things it needs to do.

 

I think people have just jumped to "it looks horrid, therefore a crap job", rather than asking why it looks horrid, which leads to a whole universe of industry conditions that are interesting and cool.

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


29 minutes ago, mr moose said:

 

I think people have just jumped to "it looks horrid, therefore a crap job", rather than asking why it looks horrid, which leads to a whole universe of industry conditions that are interesting and cool.

And every tech demo I've seen of "interactive ray tracing" is a super simple scene, nothing as complex as what a typical game would have.


  • 2 weeks later...
On 7/28/2019 at 8:39 PM, Mira Yurizaki said:

To say something isn't impressive today because it was done years ago, without really thinking about what was going on then, I feel kind of misses the point of what makes such things a breakthrough. Otherwise let's just say everything in modern hardware isn't impressive; we've had some form of it years ago! Earlier I was mentioning to a friend that it's amazing you can now get a <$100 graphics card with the same computing power as a supercomputer from almost two decades ago. Okay, but it's really not amazing, because we had something with that much compute power at consumer pricing 10-some-odd years ago.

 

NVIDIA trying to make the Apollo ray-tracing tech demo sound amazing can be argued to be laughable, but that doesn't take away from what it can still do for games today. Every NVIDIA (and by extension ATI/AMD) tech demo was presented as "amazing" to some degree, because that's the point of a tech demo: to show off what the hardware is capable of. And not just "it's only capable of doing this specific thing in this specific context", but to show off its capabilities in general. While you'd like to think tech demos are for appeasing the PCMR crowd, they're just as much about telling developers who work on graphics what that hardware is capable of doing. Or at the very least, what sort of things you can do, with the demo as proof that you can do them.

 

Remember, the customers of hardware developers aren't necessarily the end users. If anything, I'd argue the primary customers are the people who develop the software for that hardware.

 

In any case, it's unlikely we'll have fully ray-traced scenes any time soon because of the extreme compute complexity. Even Disney's animation studios and Pixar don't use fully ray-traced rendering; it's a combination of raster rendering and ray tracing. If you're wondering why not just stick with raster rendering, it's because getting raster rendering to do something of similar quality to ray tracing takes a lot of hacks and kludges. From what I've read, ray tracing is a stupidly simple algorithm; the only things that make it complex are building the data set you need to feed it for efficient operation and the sheer number of things it needs to do.

Just no proper implementation yet... 12 months later. XD

 

This is more like a full ray-traced scene and effects:

[embedded video]

Yep, at those frame rates, that's cool. Amazing speed in the hardware... not amazing *tech*. ;)

 


On 7/27/2019 at 9:46 AM, TechyBen said:

This. This is the point I'm contending. We were ray tracing entire scenes (not the RTX implementation of only some elements) in the '90s. We were real-time ray tracing at least by 2010, at ~30 to 60 fps (again, not RTX's some-elements approach, but all of them). We did not use it for gaming, though, as it offered nothing a game engine could not already "fake" at 120 fps or more on the same hardware.

 

I can link videos of people doing real-time ray tracing in Blender/Maya etc. in 2010. No one really wanted to try doing it for games, as it'd take a few GPUs or more to do. :P

 

RTX is just lower-quality ray tracing (drop the raycasts to low res, only raycast some elements, blur/smooth the results). But the silicon transistor count now means we are getting closer to doing it at "home" rather than on a render farm... though I doubt we will get to the render-farm equivalent, as the numbers (say, 4K at 60 fps) are just astronomical if you raycast every sub-pixel. You have to combine rays for effects, so an individual ray per pixel usually is not enough; you need to oversample at times (though some implementations use 1 ray but multiple interactions).

 

So IMO it "existed", just not high res. Just not with the additional API features, and needed much more hardware. So yeah, not "consumer", but there were a lot of consumer tools. Price points have declined, not tech getting better (You can run it on a 1080ti, it runs, you just need more hardware to actually get good results).

Back in 1988 I was using Traces on the Amiga to do ray tracing. It would take many hours to render a single frame at its highest resolution. Back then the Juggler demo wowed people around the world, and because of that Blender exists today. Before Traces there was Sculpt 3D on the Amiga in 1987, and certain ideas for Blender came from that. Later we had LightWave 3D, which was used to render the ray-traced scenes for Babylon 5 (the pilot episode) using a network of Amigas; it took months of processor time to render a few seconds of video at a low resolution. Later episodes used a mixture of machines, including DEC workstations, PCs, and Macs, and later on studios used large SGI render farms. Amazing to think that we can now have real-time ray tracing at home at far higher resolution than I could have imagined back then. I downloaded the Star Wars demo when I first got my 2060; astonishing, really.


Yep. Just sometimes the marketing exaggerates. Other than that, it will be cool tech, once some games come out that use it. :P


10 minutes ago, slippers_ said:

wHeRe aRE ThE sTaRS HuEhUe nAsA fAKed MoOn lANdING

Looks like you need a new keyboard; it seems to type bollox.


7 hours ago, TechyBen said:

Just no proper implementation yet... 12 months later. XD

What's "proper implementation?"

 

7 hours ago, TechyBen said:

This is more like a full ray-traced scene and effects:

NVIDIA's video on Quake 2 RTX implied it was doing full path tracing, which would make sense considering the mod that inspired it was doing it too.

 

7 hours ago, TechyBen said:

Yep, at those frame rates, that's cool. Amazing speed in the hardware... not amazing *tech*. ;)

So then what's "amazing" tech to you?


1 hour ago, Mira Yurizaki said:

What's "proper implementation?"

 

NVIDIA's video on Quake 2 RTX implied it was doing full path tracing, which would make sense considering the mod that inspired it was doing it too.

 

So then what's "amazing" tech to you?

Actually, that's a good point. Quake 2 was the first example of full-scene RTX. The Tomb Raider game also did it nicely, but with only a small improvement over the previous baked-in effects.

 

Amazing is going from a solid screen to a folding screen; I'd never seen that before (though IIRC there were examples at times, just really rare/expensive). Ray tracing is old hat. Yep, it's now reaching gaming... but as I said, Quadros and Titans existed before... it's just now slightly peeking over from the extremely costly side.

 

But I'll concede: it's fantastic stuff... it's just that I don't see it as being as amazing as others do. A bit like watching the HD-1440p-4K progression; it's no longer a surprise. Oh wow, someone's offering 8K now... like, I guess that's OK for them. :P

(See the SpaceX landings: still amazing, but by this point it's mundane, as it's weekly! XD)

