
Radeon VII neck and neck with RTX 2080 in rumored 3DMark leak


40 minutes ago, yian88 said:

No, they don't implement features for each platform individually. Implementing RTX just for Nvidia, when you have to release your game for 3 GPU vendors, between 1-3 operating systems (if we include Mac and Linux) and 2 console platforms, makes RTX a non-starter. It doesn't have enough multiplatform support (it's only supported on Windows), and it costs too much to build 2 completely different rendering pipelines for the <1% of gamers who will buy the game.

And no, devs don't all use third-party engines; most AAA studios have an in-house engine, and just a few have started using UE4, which doesn't have RTX yet. Indies and small teams won't bother implementing RTX anyway; not enough time/money for that.

https://en.wikipedia.org/wiki/List_of_Unreal_Engine_games#Unreal_Engine_4

 

If we're talking about RTX, wait till AMD cards do DXR. Nvidia cards will start picking up again once everyone has some sort of hardware ray-tracing support across the board (including devs), and everyone will be able to have a much saner conversation.

 

Also, Unreal Engine is using DX12 Raytracing (DXR), which is what RTX extends from (and AMD will most likely design around the DX12 implementation). It wouldn't be a stretch to generalise that RTX cards will be able to understand DX12's APIs.

 

Also, the point of third-party engines like Unreal Engine and Unity is for the engine developers to do all the hardware/software legwork so that game studios using them won't have to. AAA studios running their own code are already incurring hardware support and research costs*, and I wouldn't be surprised if some decide to fall back to third-party engines.

Heck, Square Enix's Kingdom Hearts 3 runs on... Unreal Engine 4.

Unity is also working on integrating realtime raytracing.

 

Also, RVII reviews are out and I'm definitely not comfortable recommending it over a 2080 at the same price tag; maybe not even at $50-$100 cheaper. We'll see when drivers for the retail release are out.

 

*

Quote

https://en.wikipedia.org/wiki/Crystal_Tools

Crystal Tools entered development in August 2005 under the code name White Engine. It was intended for the then PlayStation 3-exclusive role-playing game Final Fantasy XIII. The decision to expand Crystal Tools' compatibility to other game projects and systems marked the official project start for a company-wide engine. Development was carried out by the Research and Development Division headed by Taku Murata, which was specifically established for this purpose. As Square Enix's biggest project to date, the creation of Crystal Tools caused substantial problems in the simultaneous production of several flagship titles; various critics cited the engine as the primary cause of significant delays in the release of Final Fantasy XIII.

 

There are definitely places where developing your own game engine would be better, but at some point your time starts drifting toward engine development rather than game development. Super Hexagon was coded entirely in C and the OpenGL APIs.

9 hours ago, schwellmo92 said:

2GB HBM2 stacks exist; they could still get the same performance with 4x2GB stacks. I'm not entirely sure how much cost that would save, though; I would speculate that if 4x4GB stacks cost $300, then 4x2GB stacks would cost $200.

Even if they actually exist on the market, that doesn't make the option any more feasible IMO (I doubt they cost half the price of the 4GB stacks anyway; probably more like 60-80% of the price). As far as comparisons make it seem, AMD is just putting a gaming cooler on an MI50. To use less VRAM they'd have to make an entirely new product, which, even if it cost less to make, is still just such a low-margin option compared to the Instinct cards. The Radeon VII is already completely sold out, which makes me think AMD just didn't need or want to have the card in inventory. They really aren't concerned with having gaming options available.



5 hours ago, Trixanity said:

I think you may want to get that checked out if an opposing view point is causing you that kind of frustration.

 

Also, RTX doesn't necessarily equal ray tracing in the context it's been used in this thread. People are poking fun at yet another proprietary Nvidia technology that is being pushed early for the purpose of exclusivity. Let's be real: it'll take a while before it has any chance of becoming mainstream and until it does it runs the risk of being killed or replaced with better solutions. This wouldn't be the first time early adopters get burned.

 

It's not that it's an opposing viewpoint that's getting me frustrated. It's that it's a stupid viewpoint with zero basis in reality, facts, or any degree of common sense.

 

It takes willful ignorance to be that inherently wrong about something. And it's the willful ignorance that annoys me.

 

Also, no, it's not just about Nvidia's specific implementation; people are calling realtime raytracing in games, period, a dead end. Not everyone who criticizes RT, but a lot.

13 hours ago, leadeater said:

The pre-render part was in reference to those Doom texture upscales, where the textures are upscaled and then you replace the game's texture files, so when the game loads those textures they are the upscaled ones. All the work that actually makes the game look better, those new better textures, was pre-generated.

 

That's the big difference between that and DLSS: DLSS is done on every single frame, all of it, in real time.

 

Edit:

The only other time I would have referenced any pre-computing in relation to DLSS would be the training of it; the application of the trained model that runs on our GPUs when playing the game is inference, which runs on the Tensor cores.

 

https://simpliv.wordpress.com/2018/08/14/what-is-ai/

 

It takes a long time to learn to speak, or learn maths, or any other skill, but once you have learnt it (even incorrectly) it's rather simple to apply. We all hope we aren't applying incorrect knowledge. AI and machine learning aren't much different in that respect.

 

Yes, it was in relation to inference. Which is the issue: inference is not pre-rendering and bears absolutely no resemblance to pre-rendering. By that theory, variable rate shading is pre-rendering, textures are pre-rendering, model geometries are pre-rendering, and so on and so forth.

 

12 hours ago, Mira Yurizaki said:

If DLSS is a post-processing technique, then the render is already a 2D image. So no, in theory it shouldn't be any more complicated than upscaling any other image. I can see some issues with the output producing a result that could confuse the upscaler, but that's about it.

 

The thing that's getting to me is that in a game there is an impractical number of scenes and angles to choose from. It's easy for a tech demo or a benchmark to have DLSS applied, because it's nothing more than a movie where the frames are generated on the fly, and it's going to be the same frames every time.

 

Seriously, compare actual textures to the in-game images created by combining them with lighting, shadows, etc. In all but the lowest-fidelity games, the final result of all that on your screen is a vastly more complex image than any of the individual elements, even though all of the individual elements may be fed to the GPU at much higher resolutions than your display resolution.
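For what it's worth, the "upscaling an already-rendered 2D image" idea is easy to sketch. Below is a minimal Python illustration using plain nearest-neighbour scaling as a stand-in for the real (proprietary) DLSS model; the only point is that the input to a post-process upscaler is a finished 2D frame, not scene data.

```python
# Minimal sketch: post-process upscaling of an already-rendered 2D frame.
# Nearest-neighbour scaling stands in for the actual DLSS inference model;
# the upscaler sees only finished pixels, never geometry or textures.

def upscale_nearest(frame, factor):
    """Scale a 2D frame (list of rows of pixel values) by an integer factor."""
    out = []
    for row in frame:
        # Repeat each pixel `factor` times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...and repeat each scaled row `factor` times vertically.
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

# A 2x2 "rendered frame" upscaled to 4x4.
frame = [[1, 2],
         [3, 4]]
print(upscale_nearest(frame, 2))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Of course, the whole argument above is that a real final frame (lighting, shadows, etc. combined) is far richer than this toy input, which is where a naive upscaler struggles and a trained model is supposed to help.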

15 minutes ago, CarlBar said:

 

It's not that it's an opposing viewpoint that's getting me frustrated. It's that it's a stupid viewpoint with zero basis in reality, facts, or any degree of common sense.

 

It takes willful ignorance to be that inherently wrong about something. And it's the willful ignorance that annoys me.

 

Also, no, it's not just about Nvidia's specific implementation; people are calling realtime raytracing in games, period, a dead end. Not everyone who criticizes RT, but a lot.

Show me someone who hasn't said something along the lines of (or the intent) "RTX something something Nvidia something proprietary bullshit vendor lock-in something AI something something needs industry standard to take off" or "ray tracing tanks performance and isn't ready yet". 

 

Unless you're referring to other people than in this thread in which case it isn't relevant to the thread.

23 minutes ago, Trixanity said:

Show me someone who hasn't said something along the lines of (or the intent) "RTX something something Nvidia something proprietary bullshit vendor lock-in something AI something something needs industry standard to take off" or "ray tracing tanks performance and isn't ready yet". 

 

Unless you're referring to other people than in this thread in which case it isn't relevant to the thread.

 

On 2/6/2019 at 2:30 PM, WikiForce said:

Ray tracing is a gimmick, not worth the performance loss with only 1 title on the market supporting it right now. It will just turn out like PhysX.

 

 

Page 1, and the post that started this whole discussion (I quoted someone quoting someone else who was quoting this post).

4 hours ago, yian88 said:

No, they don't implement features for each platform individually. Implementing RTX just for Nvidia, when you have to release your game for 3 GPU vendors, between 1-3 operating systems (if we include Mac and Linux) and 2 console platforms, makes RTX a non-starter. It doesn't have enough multiplatform support (it's only supported on Windows), and it costs too much to build 2 completely different rendering pipelines for the <1% of gamers who will buy the game.

And no, devs don't all use third-party engines; most AAA studios have an in-house engine, and just a few have started using UE4, which doesn't have RTX yet. Indies and small teams won't bother implementing RTX anyway; not enough time/money for that.

I agree with you that companies do sometimes vary features from platform to platform, but UE4 does natively support DXR.

26 minutes ago, CarlBar said:

 

 

 

Page 1, and the post that started this whole discussion (I quoted someone quoting someone else who was quoting this post).

Quite sure that's an RTX reference given the context (responding to claims that ray tracing makes RTX cards worth buying over the VII right now, and referencing PhysX).

 

 

8 hours ago, leadeater said:

Reviews!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

 

Yea I stayed up haha.

Wow you're not kidding, you really did haha.

7 hours ago, CarlBar said:

Yes, it was in relation to inference. Which is the issue: inference is not pre-rendering and bears absolutely no resemblance to pre-rendering. By that theory, variable rate shading is pre-rendering, textures are pre-rendering, model geometries are pre-rendering, and so on and so forth.

I don't think you understood the point: the inference model was applied to those textures before the game was run; that's pre-processing. The inference model is not run in real time during the game; in fact it's never run again unless you want to reprocess and replace the textures.

 

For this specific application, and how it relates to the discussion, the upscaling of the textures is pre-computed; that has nothing to do with how they are actually used in the game.

 

With DLSS, the inference model is run in real time, on every frame. Huge difference.
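The offline-vs-realtime distinction can be sketched in a few lines. This is purely illustrative: `upscale` is a hypothetical stand-in for an expensive ML upscaler, and all file and function names are made up.

```python
# Sketch of the two workflows being contrasted (all names illustrative).

def upscale(image):
    """Stand-in for an expensive ML upscaler; here it just tags its input."""
    return f"upscaled({image})"

# Doom-style texture pack: the model runs ONCE, offline, and the results are
# written back into the game files. Nothing ML-related runs at play time.
game_files = {"wall.png": "wall", "floor.png": "floor"}
game_files = {name: upscale(tex) for name, tex in game_files.items()}

# DLSS-style: the model runs on EVERY rendered frame, in real time.
def render_loop(frames):
    return [upscale(frame) for frame in frames]

print(game_files["wall.png"])             # → upscaled(wall)
print(render_loop(["frame0", "frame1"]))  # → ['upscaled(frame0)', 'upscaled(frame1)']
```

The first workflow pays the inference cost once per texture, before shipping; the second pays it every frame, which is why it needs dedicated hardware.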

 

Edit:

For that other discussion ages ago, the same applies. I don't know where you got the idea that I think inference is pre-rendering. My comments were in relation to the training and the creation of the inference model that gets run on the GPU; an inference model that is created on a per-game basis for DLSS and optimized/fixed on a per-game basis for RTX denoising. Both these factors rule out reliable benchmarking, before you even take into consideration that it needs to run on any GPU from any vendor that supports the API (per-vendor hardware acceleration paths allowed).

 

AI, and machine learning specifically, operates on the principle of constant evolution and improvement, and that is counter to how benchmarking works: if it isn't the same every time, it isn't a benchmark. Hammering away at the inference model on a giant cluster of 5000+ GPUs to find a faster version and a higher benchmark score is in fact bringing in external computation power, and it invalidates every previously generated score. A driver improvement that makes the task run quicker doesn't alter the workload being run that generates the score.


So, basically put: new card, same price as the competitor, only more beneficial in productivity. For gaming it doesn't matter, and RTX has DLSS.

Still sticking with Nvidia lol.




New..   AMD... GP...U       not Good enough...     at gaming....   mUst .. use... dying breath...    to .....          Shit....            on      RTX.... 😵



2 hours ago, mr moose said:

New..   AMD... GP...U       not Good enough...     at gaming....   mUst .. use... dying breath...    to .....          Shit....            on      RTX.... 😵

I see it just as a good alternative. Of course that depends on the price we're going to see in stores, and that also changes from country to country. For example, there is no way we'll see this card in Greek stores for under 750-800 €, whereas I can buy the Palit 2080 (cheapest at the moment) here for 700 €.

1 hour ago, Settlerteo said:
4 hours ago, mr moose said:

New..   AMD... GP...U       not Good enough...     at gaming....   mUst .. use... dying breath...    to .....          Shit....            on      RTX.... 😵

I see it just as a good alternative. Of course that depends on the price we're going to see in stores, and that also changes from country to country. For example, there is no way we'll see this card in Greek stores for under 750-800 €, whereas I can buy the Palit 2080 (cheapest at the moment) here for 700 €.

I also want to add that a big store here just announced them: the Sapphire one for 950 € and the MSI one for 900 €.

