Radeon VII neck and neck with RTX 2080 in rumored 3DMark leak

exetras
3 minutes ago, ZacoAttaco said:

Oh yeah, good point. The EU will get rekt; there seems to be little incentive to try and hunt one down if you live in that region. I've heard technology is expensive in the EU but I'm not really sure why. Is it something to do with Brexit or tariffs or similar?

We don't know AMD's internal product distribution. It could be that high-end AMD GPUs sell terribly in the EU but great in SE Asia; if true, they'd align supplies accordingly. It could also be a rolling rollout, so this isn't quite a paper launch, just the first wave. Intel has done this for almost every generation of Skylake consumer parts lately, same with Nvidia and most of their high-end GPUs.

2 minutes ago, Taf the Ghost said:

We don't know AMD's internal product distribution. It could be that high-end AMD GPUs sell terribly in the EU but great in SE Asia; if true, they'd align supplies accordingly. It could also be a rolling rollout, so this isn't quite a paper launch, just the first wave. Intel has done this for almost every generation of Skylake consumer parts lately, same with Nvidia and most of their high-end GPUs.

Thanks, I wasn't fully aware of the situation. So between NVIDIA, Intel and AMD, they're all contributing to this tough EU market. Especially since the Radeon VII will be a somewhat limited-release card, it doesn't seem like they'll be easy to get in the EU.

One of the guys from Overclockers in the UK posted this yesterday...

 

If all goes well tomorrow we have:
30+ Sapphire
20+ MSI
10+ Gigabyte
10+ Powercolor
6 Asus
 

 

32 minutes ago, IntMD said:

One of the guys from Overclockers in the UK posted this yesterday...

 

If all goes well tomorrow we have:
30+ Sapphire
20+ MSI
10+ Gigabyte
10+ Powercolor
6 Asus
 

 

Yeah, and the next day he'll be saying they still have the same stock to sell. Doesn't anyone want them?

12 hours ago, CarlBar said:

Every time I hear people go on about how ray tracing will never take off, I want to strangle a kitten, it's so frustrating.

I think you may want to get that checked out if an opposing viewpoint is causing you that kind of frustration.

 

Also, RTX doesn't necessarily equal ray tracing in the context it's been used in this thread. People are poking fun at yet another proprietary Nvidia technology that is being pushed early for the purpose of exclusivity. Let's be real: it'll take a while before it has any chance of becoming mainstream and until it does it runs the risk of being killed or replaced with better solutions. This wouldn't be the first time early adopters get burned.

13 hours ago, Mira Yurizaki said:

DLSS should be a driver-wide thing, like MFAA. I don't know what NVIDIA is doing making it a "must be explicitly supported" thing.

How do you NOT know what nvidiot is doing? It's what they've been doing for two decades: G-Sync, PhysX, GameWorks, RTX/DLSS, CUDA and so on are all locked to their hardware on purpose so you have to buy only their cards.

Lucky for us, none of these technologies actually took off except CUDA, because in the real world developers won't make one game engine for each vendor: one for nvidiot, one for AMD, one for Intel's upcoming GPUs, Xbox, PS5, mobiles.

What almost always wins for general consumers is open standards: OpenGL/ES, Vulkan, OpenCL, CPU physics (instead of nvidiot-locked GPU PhysX), etc.

 

3 hours ago, Taf the Ghost said:

Lol.

 

Per Steve at HUB, it seems to be about 9 AM EDT, if he was remembering right. So I think it's about 3 hours out from now.

Reviews!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

 

Yea I stayed up haha.

1 minute ago, leadeater said:

Reviews!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

 

Yea I stayed up haha.

Haha

41 minutes ago, IceCold008 said:

Yeah, and the next day he'll be saying they still have the same stock to sell. Doesn't anyone want them?

Probably... however, I'm currently watching the amounts in stock going down quite rapidly... They've gone from 10+ in stock to 8 to 7 to 1 of the Sapphire ones in the last 5 mins. TBH, if you were going to get one, that is the one to go for, as they're all the same and the Asus & MSI ones are £150 more!

 

Edit... for some reason at least 5 of the Asus ones have been bought too...

1 hour ago, VegetableStu said:

in the real world, devs use 3rd-party engines that cater to multiple platforms with the same project ._.

No, they don't implement features for each platform individually. Implementing RTX just for Nvidia, when you have to release your game for three GPU vendors, between one and three operating systems (if we include Mac and Linux) and two console platforms, makes RTX a non-starter: it doesn't have enough multiplatform support (it's Windows-only), and it costs too much to build two completely different rendering pipelines for the <1% of gamers who will buy the game.

And no, devs don't all use third-party engines: most AAA studios have in-house engines, just a few have started using UE4 (which doesn't have RTX yet), and indies and small teams won't bother implementing RTX anyway; not enough time/money for that.

9 hours ago, schwellmo92 said:

2GB HBM2 stacks exist; they could still get the same performance with 4x2GB stacks. I'm not entirely sure how much cost that would save, though. I'd speculate that if 4x4GB stacks cost $300, then 4x2GB stacks would cost $200.

Even if they actually exist on the market, that doesn't make the option any more feasible IMO (I doubt they cost half the price of the 4GB stacks anyway; probably more like 60-80% of the price). As far as comparisons make it seem, AMD is just putting a gaming cooler on an MI50. To use less VRAM they'd have to make an entirely new product which, even if it cost less to make, is still just such a low-margin option compared to the Instinct cards. The Radeon VII is already completely sold out, which makes me think AMD just didn't need or want to have the card in inventory. They really aren't concerned with having gaming options available.
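
To put rough numbers on that speculation (using only the figures guessed above, not actual market prices):

4 × 4 GB stacks ≈ $300  →  ≈ $75 per 4 GB stack
2 GB stack at 60-80% of that  →  ≈ $45-60 per stack
4 × 2 GB stacks  →  ≈ $180-240 total

So the saving would be roughly $60-120 on a ~$700 card, in exchange for half the VRAM.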

5 hours ago, Trixanity said:

I think you may want to get that checked out if an opposing viewpoint is causing you that kind of frustration.

 

Also, RTX doesn't necessarily equal ray tracing in the context it's been used in this thread. People are poking fun at yet another proprietary Nvidia technology that is being pushed early for the purpose of exclusivity. Let's be real: it'll take a while before it has any chance of becoming mainstream and until it does it runs the risk of being killed or replaced with better solutions. This wouldn't be the first time early adopters get burned.

 

It's not that it's an opposing viewpoint that's getting me frustrated; it's that it's a stupid viewpoint with zero basis in reality, facts, or any degree of common sense.

It takes willful ignorance to be that inherently wrong about something. And it's the willful ignorance that annoys.

Also, no, it's not just about NVIDIA's specific implementation; people are calling real-time ray tracing in games, period, a dead end. Not everyone who criticizes RT, but a lot of them.

13 hours ago, leadeater said:

The pre-render part was in reference to those Doom texture upscales, where the textures are upscaled and then you replace the game's texture files, so when the game loads those textures they are the upscaled ones. All the work that actually makes the game look better, those new, better textures, was pre-generated.

 

That's the big difference from DLSS: DLSS is done on every single frame, all of it, in real time.

 

Edit:

The only other time I would have referenced any pre-computing in relation to DLSS would be its training; the application of the trained model that runs on our GPUs when playing the game is inference, which runs on the Tensor cores.

 

https://simpliv.wordpress.com/2018/08/14/what-is-ai/?utm_campaign=News&utm_medium=Community&utm_source=DataCamp.com

 

It takes a long time to learn to speak, or learn maths, or any other skill, but once you have learnt it (even incorrectly) it's rather simple to apply. We all hope we aren't applying incorrect knowledge. AI and machine learning aren't much different in that respect.

 

Yes, it was in relation to inference. Which is the issue: inference is not pre-rendering and bears absolutely no resemblance to pre-rendering. By that theory, variable rate shading is pre-rendering. Textures are pre-rendering. Model geometries are pre-rendering, and so on and so forth.

 

12 hours ago, Mira Yurizaki said:

If DLSS is a post-processing technique then the render is already a 2D image. So no, in theory it shouldn't be any more complicated than upscaling any other image. I can see some issues with the output producing a result that could confuse the upscaler, but that's about it.

 

The thing that's getting to me is that in a game there are an impractical number of scenes and angles to choose from. It's easy for a tech demo or a benchmark to have DLSS applied, because it's nothing more than a movie where the frames are generated on the fly and it's going to be the same frames every time.

 

Seriously, compare actual textures to the in-game images created by combining them with lighting, shadows, etc. In all but the lowest-fidelity games, the final result of all that on your screen is a vastly more complex image than any of the individual elements, even though all of the individual elements may be fed to the GPU at much higher resolutions than your display resolution.
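
For what it's worth, the "post-processing on a finished 2D image" idea in the quote above reduces to an operation like the following. This is a minimal sketch using Pillow's real bicubic resize as a stand-in for the learned upscaler; DLSS itself applies trained weights rather than a fixed filter, so treat this as illustration only:

from PIL import Image

def upscale_frame(frame: Image.Image, factor: int = 2) -> Image.Image:
    # The input is just the finished 2D frame; nothing from the 3D pipeline
    # (geometry, lighting, textures) is needed at this stage.
    w, h = frame.size
    return frame.resize((w * factor, h * factor), Image.BICUBIC)

frame = Image.new("RGB", (1280, 720))  # stand-in for a rendered 720p frame
output = upscale_frame(frame)          # 2560x1440 result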

15 minutes ago, CarlBar said:

 

It's not that it's an opposing viewpoint that's getting me frustrated; it's that it's a stupid viewpoint with zero basis in reality, facts, or any degree of common sense.

It takes willful ignorance to be that inherently wrong about something. And it's the willful ignorance that annoys.

Also, no, it's not just about NVIDIA's specific implementation; people are calling real-time ray tracing in games, period, a dead end. Not everyone who criticizes RT, but a lot of them.

Show me someone who hasn't said something along the lines of (or with the intent of) "RTX something something Nvidia something proprietary bullshit vendor lock-in something AI something something needs industry standard to take off" or "ray tracing tanks performance and isn't ready yet".

 

Unless you're referring to people other than those in this thread, in which case it isn't relevant to the thread.

23 minutes ago, Trixanity said:

Show me someone who hasn't said something along the lines of (or with the intent of) "RTX something something Nvidia something proprietary bullshit vendor lock-in something AI something something needs industry standard to take off" or "ray tracing tanks performance and isn't ready yet".

 

Unless you're referring to people other than those in this thread, in which case it isn't relevant to the thread.

 

On 2/6/2019 at 2:30 PM, WikiForce said:

Ray tracing is a gimmick, not worth the performance loss with only one title on the market supporting it right now. It will just turn out like PhysX.

 

 

Page 1, and the post that started this whole discussion (I quoted someone quoting someone else who was quoting this post).

4 hours ago, yian88 said:

No, they don't implement features for each platform individually. Implementing RTX just for Nvidia, when you have to release your game for three GPU vendors, between one and three operating systems (if we include Mac and Linux) and two console platforms, makes RTX a non-starter: it doesn't have enough multiplatform support (it's Windows-only), and it costs too much to build two completely different rendering pipelines for the <1% of gamers who will buy the game.

And no, devs don't all use third-party engines: most AAA studios have in-house engines, just a few have started using UE4 (which doesn't have RTX yet), and indies and small teams won't bother implementing RTX anyway; not enough time/money for that.

I agree with you that companies do vary things per platform sometimes, but UE4 does natively support DXR.

26 minutes ago, CarlBar said:

 

 

 

Page 1, and the post that started this whole discussion (I quoted someone quoting someone else who was quoting this post).

Quite sure that's an RTX reference given the context (responding to claims that ray tracing makes RTX cards worth buying over the VII right now, and referencing PhysX).

 

 

8 hours ago, leadeater said:

Reviews!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

 

Yea I stayed up haha.

Wow you're not kidding, you really did haha.

7 hours ago, CarlBar said:

Yes, it was in relation to inference. Which is the issue: inference is not pre-rendering and bears absolutely no resemblance to pre-rendering. By that theory, variable rate shading is pre-rendering. Textures are pre-rendering. Model geometries are pre-rendering, and so on and so forth.

I don't think you understood the point: the inference model was applied to those textures before the game was run; that's pre-processing. The inference model is not run in real time during the game; in fact, it's never run again unless you want to reprocess the textures and replace them again.

 

For this specific application, and how it relates to the discussion, the upscaling of the textures is pre-computed; that has nothing to do with how they are actually used in the game.

 

With DLSS, the inference model is run in real time, on every frame. Huge difference.
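
To make that contrast concrete, here's a minimal sketch in Python; every name in it is hypothetical, and a trivial nearest-neighbour doubler stands in for the trained model:

def infer(image):
    # Placeholder for a trained upscaling model; here, a trivial 2x
    # nearest-neighbour doubler over a list-of-lists "image".
    return [[px for px in row for _ in range(2)] for row in image for _ in range(2)]

def upscale_textures_offline(textures):
    # The Doom-style mod: run the model once, ahead of time, and overwrite
    # the texture files. No model executes while the game is running.
    return {name: infer(tex) for name, tex in textures.items()}

def render_frame_with_dlss(low_res_frame):
    # DLSS-style: the model runs on every single rendered frame, in real time.
    return infer(low_res_frame)

textures = {"wall": [[1, 2], [3, 4]]}
prebaked = upscale_textures_offline(textures)     # done once, before play
frame = render_frame_with_dlss([[5, 6], [7, 8]])  # done 60+ times per second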

 

Edit:

For that other discussion ages ago, the same applies. I don't know where you got the idea that I think inference is pre-rendering. My comments were in relation to the training and the creation of the inference model that gets run on the GPU; an inference model that is created on a per-game basis for DLSS and optimized/fixed on a per-game basis for RTX denoising. Both these factors exclude its applicability from being benchmarked reliably, before you even take into consideration that it needs to run on any GPU from any vendor that supports the API (per-vendor hardware acceleration paths allowed).

 

AI, and machine learning specifically, operate on the principle of constant evolution and improvement, and that is counter to how benchmarking works: if it isn't the same every time, it isn't a benchmark. Hammering away at the inference model on a giant cluster of 5000+ GPUs to find a faster version and get a higher benchmark score is, in fact, bringing in external computation power, and it invalidates every previous score generated. A driver improvement that makes the task run quicker doesn't alter the workload being run that generates the score.
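
To illustrate that last point with a hypothetical scoring harness (not any real benchmark's code): a score is only comparable to earlier scores if the workload is identical, so a retrained inference model starts a new, incomparable score series, while a faster driver running the same workload does not:

import hashlib
import time

def workload_id(model_weights: bytes) -> str:
    # Identify the exact inference model the benchmark ran with.
    return hashlib.sha256(model_weights).hexdigest()[:12]

def run_benchmark(model_weights: bytes, frames: int = 1000) -> dict:
    start = time.perf_counter()
    for _ in range(frames):
        pass  # stand-in for the per-frame render + inference work
    fps = frames / (time.perf_counter() - start)
    # Scores are only comparable when the workload id matches: retraining the
    # model changes the id (a new workload); a faster driver does not.
    return {"workload": workload_id(model_weights), "fps": round(fps)}

print(run_benchmark(b"model-v1"))
print(run_benchmark(b"model-v2"))  # new model -> new workload id -> new series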

2 hours ago, ZacoAttaco said:

Wow you're not kidding, you really did haha.

I'm awake now though.

So basically: new card, same price as the competitor, only more beneficial in productivity. But for gaming it doesn't matter, and RTX has DLSS.

Still sticking with Nvidia lol.

New..   AMD... GP...U       not Good enough...     at gaming....   mUst .. use... dying breath...    to .....          Shit....            on      RTX.... ?

2 hours ago, mr moose said:

New..   AMD... GP...U       not Good enough...     at gaming....   mUst .. use... dying breath...    to .....          Shit....            on      RTX.... ?

I see it just as a good alternative. Of course, that depends on the price we're going to see in stores, and that also changes from country to country. For example, there's no way we'll see this card in Greek stores for under 750-800 €, whereas I can buy the Palit 2080 (the cheapest at the moment) here for 700 €.

1 hour ago, Settlerteo said:
4 hours ago, mr moose said:

New..   AMD... GP...U       not Good enough...     at gaming....   mUst .. use... dying breath...    to .....          Shit....            on      RTX.... ?

I see it just as a good alternative. Of course, that depends on the price we're going to see in stores, and that also changes from country to country. For example, there's no way we'll see this card in Greek stores for under 750-800 €, whereas I can buy the Palit 2080 (the cheapest at the moment) here for 700 €.

I also want to add that they just announced them at a big store here: the Sapphire one for 950 € and the MSI one for 900 €.
