Mojang officially cancels Minecraft Super Duper Graphics Pack due to poor performance

TechyBen:

AAAaaannd... I was right. They didn't even wait a month. They cancelled the "special fancy DLC textures" because, I'm assuming, Nvidia twisted their arm (with cash or promises of sales on Win10).

PS: I guess now I'm getting a 2080 Ti Super and making that Creeper Minecraft case I've always wanted to make... and naming it "RTX ON!!!".


2 hours ago, TechyBen said:

AAAaaannd... I was right. They didn't even wait a month. They cancelled the "special fancy DLC textures" because, I'm assuming, Nvidia twisted their arm (with cash or promises of sales on Win10). [...]

Wasn't there already a mod that does this?

But this is nothing more than ray tracing; you don't get the new texture packs. RT is probably a month of work. The big point of RT for devs/studios is that they no longer need to spend massive amounts of time on visual tricks, and they get better, more accurate visuals.

 


2 minutes ago, GoodBytes said:

Wasn't there already a mod that does this? [...]

There was one for Java that worked on any (powerful) GPU, but now Microsoft is releasing an official one for Bedrock that requires RTX cards.

 

Also, in my experience, high-end shaders need a good texture pack anyway, for a few reasons. One is subjective: it just looks better, and seems imbalanced without one. The other is quite objective and technical: most games already ship a bunch of material property data for each surface, so ray tracing can just use it, but Minecraft doesn't have any of this by default; the textures are just images. For the best experience, additional data therefore has to be authored for all of those textures, including reflectivity, specular vs. diffuse, and so on.
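Roughly, the kind of extra per-surface data a path tracer wants looks something like this. This is a toy sketch only; the class and field names are made up for illustration and are not Minecraft's (or any pack's) actual format:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SurfaceMaterial:
    # Hypothetical per-texture material properties; the names are
    # illustrative, not any real pack format.
    base_color: Tuple[float, float, float]  # the plain albedo image vanilla ships
    roughness: float = 1.0   # 0 = mirror-smooth, 1 = fully diffuse
    metalness: float = 0.0   # 0 = dielectric, 1 = metal
    emissive: float = 0.0    # light the surface itself gives off

# Vanilla textures are "just images": only base_color exists, so every
# surface would fall back to the same dull defaults under a path tracer.
vanilla_stone = SurfaceMaterial(base_color=(0.5, 0.5, 0.5))

# An RT-ready pack has to author the extra channels for every texture.
rt_gold_block = SurfaceMaterial(base_color=(1.0, 0.77, 0.34),
                                roughness=0.15, metalness=1.0)

print(vanilla_stone.roughness, rt_gold_block.metalness)  # 1.0 1.0
```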



5 minutes ago, GoodBytes said:

Wasn't there already a mod that does this? [...]

I kinda agree. But it's still a performance hog. :P

So these kinds of excuses probably cover up the internal "politics" of the situation.


7 minutes ago, GoodBytes said:

Wasn't there already a mod that does this? [...]

The next Xbox will have hardware RT support, so this is really for showing off comparisons against gaming PCs next year. It's also likely easy to add with their new render engine.

 

https://www.minecraft.net/en-us/article/render-dragon-and-nvidia-ray-tracing

 

Minecraft will have an entirely new rendering engine, so this little offshoot to add RT support is going to be for marketing purposes. Nvidia probably had to do something to get this out the door, so that at least something people actually play uses RTX. Though given it's path tracing, it's going to be very cut down to work on any RTX card. Gen 1 RTX is just not ready for prime-time, and that'll be clear next year when we get the improved versions.


5 hours ago, Taf the Ghost said:

Gen 1 RTX is just not ready for prime-time, and that'll be clear next year when we get the improved versions.

I personally doubt we'll see reasonably performant cards (in terms of both FPS and rays traced, at 1440p/4K) at reasonable prices until Gen 3/4.

 

It's a shame Nvidia made Pascal as good as it was; self-competition ftw.


2 hours ago, thorhammerz said:

It's a shame Nvidia made Pascal as good as it was; self-competition ftw.

I feel this; currently I see basically no reason to upgrade from what I have to this generation of cards.


3 hours ago, thorhammerz said:

I personally doubt we'll see reasonably performant cards (in terms of both FPS and rays traced, at 1440p/4K) at reasonable prices until Gen 3/4. [...]

While RTX gen 1 won't have a long life, Turing kind of will. The solid DX12 support and more advanced async compute will make it work well for a while, oddly enough.

As for RTX, even gen 2 will be a massive improvement. From discussions with people in the GPU space, Gen 1 RTX bottlenecks itself massively. Those "10 Gigarays" on the RTX 2080 Ti are more like 1.5 Gigarays in real workloads. It should be quite functional for real use in games in Gen 2, as will AMD's approach. By Gen 3/4 of the tech, it's going to be standard in a lot of games because of how much easier it makes the lighting engine.


1 hour ago, Taf the Ghost said:

It should be quite functional for real use in games in Gen 2, as will AMD's approach. By Gen 3/4 of the tech, it's going to be standard in a lot of games because of how much easier it makes the lighting engine.

This is fair, I'd imagine. Once the tech matures a bit and the performance hit isn't quite so harsh, then sure, we can use it feasibly in games.

But what irks me the most is the premium it has added to graphics card prices.

Top-tier cards are now double the price, going from $600-700 for a Ti card to basically twice that. It's just not justified. You get a feature you'll probably never realistically use in games, aside from 'trying it out' when you get the card, and the card's performance is nowhere near the difference in price over the outgoing generation(s).


1 minute ago, SADS said:

This is fair, I'd imagine. Once the tech matures a bit and the performance hit isn't quite so harsh, then sure, we can use it feasibly in games. [...]

RTX features in Turing are functionally a tech demo. Even Gen 2 of the tech could see a ~10x increase in output, because it currently bottlenecks itself badly. Though the deeper issue is that AMD's approach (and, consequently, MS's and/or Sony's) looks like it'll be far more effective. Ray tracing actually works better on a CPU than on a GPU, so offloading a lot of it to a software-based solution will see a massive performance uplift, allowing the GPU to do the hardware-based tasks it's much better at. "Hybrid Rendering" is going to be the buzzword over the next few years.

 


8 hours ago, Taf the Ghost said:

Ray tracing actually works better on a CPU than on a GPU, so offloading a lot of it to a software-based solution will see a massive performance uplift, allowing the GPU to do the hardware-based tasks it's much better at. "Hybrid Rendering" is going to be the buzzword over the next few years.

I didn't know this. If that's the case, then I assume chips with more cores could potentially offer a lot more to gamers than chips with stronger single-threaded performance. I guess that's where AMD comes in as the better offer.

Exciting times!


39 minutes ago, SADS said:

I didn't know this. If that's the case, then I assume chips with more cores could potentially offer a lot more to gamers than chips with stronger single-threaded performance. I guess that's where AMD comes in as the better offer.

Exciting times!

For as much as I have to tell people "GPU pipelines aren't my thing," I really should read up on the technical details more. (I just interact with people who do focus on GPU pipelines and the like.) So don't take all of this as gospel, but the entire pipeline for how a ray is traced (or a path is traced, since Minecraft is using path tracing) is actually a mostly serial operation until the point where you need to do the BVH intersection calculation. That's where the real "hardware accelerated" aspect comes in, as you need fixed-function units to do those calculations efficiently. This is why this stuff ends up being done on GPUs: they handle the real crux of the entire pipeline much more easily.

 

As a result, all of this talk of RTX is really just Nvidia's way of handling Windows DXR; that's the framework necessary to make it work. It'll put a lot on the CPU as the major bottlenecks on the GPU side are removed with further generations of RT hardware.
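To make the BVH step above concrete, here's a toy sketch of the slab test a traversal runs against each node's bounding box: the kind of check RT cores accelerate in fixed-function hardware. This is deliberately simplified, scalar Python for illustration, not how any real driver or engine implements it:

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned box (one BVH node)?
    inv_dir = 1 / direction per component, precomputed so traversal
    needs only multiplies, not divides."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))  # latest entry across the slabs
        t_far = min(t_far, max(t1, t2))    # earliest exit across the slabs
    return t_near <= t_far  # the intervals overlap => the ray hits the box

# A ray from the origin along (1, 1, 1) enters the box spanning
# (1, 0, 0)..(3, 2, 2), so this prints True.
print(ray_aabb_hit((0.0, 0.2, 0.2), (1.0, 1.0, 1.0), (1, 0, 0), (3, 2, 2)))
```

A traversal runs this test millions of times per frame while walking the tree, which is exactly why it gets baked into dedicated hardware.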


1 hour ago, Taf the Ghost said:

[...] So don't take all of this as gospel, but the entire pipeline for how a ray is traced (or a path is traced, since Minecraft is using path tracing) is actually a mostly serial operation until the point where you need to do the BVH intersection calculation. [...]

Regardless of whether it's true (and I'd assume it is, but again, regardless), I'd think each ray is effectively an independent task, right? So the parallelism comes from the fact that you're tracing 2M+ rays per frame, even if each one is a single thread.
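In toy form, that "embarrassingly parallel" framing looks like this: each ray is a self-contained job, and the win comes purely from how many of them you can run at once. The `trace` function here is a hypothetical stand-in, not real renderer code:

```python
from concurrent.futures import ProcessPoolExecutor

def trace(ray):
    """Stand-in for tracing one ray: a self-contained job that reads
    shared scene data but writes nothing another ray depends on."""
    origin, direction = ray
    # ... walk the BVH, shade the hit point, etc. (dummy math below) ...
    return sum(o * d for o, d in zip(origin, direction))

# A toy batch of rays; a real frame would have millions.
rays = [((0.0, 0.0, 0.0), (x / 100.0, 1.0, 0.0)) for x in range(1000)]

if __name__ == "__main__":
    # Because rays share nothing with each other, they can be farmed out
    # to any number of workers; a GPU scales the same idea to thousands
    # of hardware threads per frame.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(trace, rays, chunksize=100))
    print(len(results))  # 1000 independent results
```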



34 minutes ago, Ryan_Vickers said:

Regardless of whether it's true (and I'd assume it is, but again, regardless), I'd think each ray is effectively an independent task, right? So the parallelism comes from the fact that you're tracing 2M+ rays per frame, even if each one is a single thread.

In theory, yes, but I believe in practice you're culling so much that the actual ray counts really aren't that high. The issue is the way the interactions are calculated, which is why you want to do it on a GPU. It's also for keeping track of all of those interactions that you end up needing the "ultra parallel" approach a GPU provides.

