
Nvidia reps suggest 2060 and below will not have Ray Tracing support

Recommended Posts

3 hours ago, mynameisjuan said:

If the 2080 Ti can barely hit 60fps@1080p, who the fuck cares if the 2060 doesn't support it? Are you seriously going to play at 30fps@720p?

And another one ignoring all the stuff that has been posted right in this topic.

 

60fps@1080p, most likely without the RT cores being used. So it's more or less a representation of how it would run without RT hardware, by brute-forcing it.

Look up the Enlisted demo on YouTube. You will see a solid 100+ fps at 4K with RT on.

 

Edit, because this comes up SO MANY TIMES:

The timestamp is set to 2:28, at the start of the 4K demo. The FPS is shown in the top left the whole time.

 

1 hour ago, mynameisjuan said:

They need to compete in non-ray-tracing performance first. AMD is the one who needs to pull their head out.

RTG doesn't have the R&D money that Nvidia has to just blow on stupid hardware implementations like "RT" cores. I would expect the new node and some architecture optimizations to be plenty to provide at least 2080 performance (keep in mind that in non-ray-tracing applications the 2080 will most likely perform around 1080 Ti levels) at a much more competitive price, around 500 to 600 dollars. If they can provide that by the end of the year while also implementing a sort of "ray tracing" like Nvidia has, then Vega 7nm has the potential to disrupt Nvidia and create a healthy GPU market again.

 

38 minutes ago, DildorTheDecent said:

Please don't speak about the Holy Grail of PC Gaming like this. Remember that they can do no wrong. /s

 

Just wait until 2030 for them to become viable again xD

I fully believe that AMD can do some stupid shit; the most recent screw-ups I can think of would be Vega Frontier Edition and the new Athlon they are releasing (due to it not being unlocked). But the last time I remember AMD being great would be the Hawaii vs. Kepler days. I fully believe they have the potential to release a competent product now that they don't have to fight losing battles on two fronts against both Intel and Nvidia.


8086k Winner BABY!!

My tech stuff

 

Main rig

Cpu: R5 1600 (4.0ghz 1.325v)

Mobo: Gigabyte b450m ds3h

Ram: 16gb geil 3000 16-18-18-18-36 @ 3200 16-12-19-15-24

Gpu: Gigabyte 1070 TI Gaming

 

LG G6 | Snapdragon 821 w Adreno 530 | 4gb ram | 32gb storage |

 


I don't know why this triple-posted; mods, could you please delete the copy?

Edited by TheDankKoosh


My only hope for these cards is that they offer the same performance increase as the 2080 at the same MSRP as the previous gen. I refuse to buy a 2050 if it's more than 130 USD, if only to get a regular generational leap.


Who needs fancy graphics and high resolutions when you can get a 60 FPS frame rate on iGPUs?

44 minutes ago, JediFragger said:

Dug this up;

 

https://www.pcgamer.com/nvidia-talks-ray-tracing-and-volta-hardware/

 

"While DXR will run on Nvidia's existing hardware, RTX will require a Volta (or later) GPU."

 

 

RTX is much more than just RT, though.

DLSS is part of RTX and only uses Tensor Cores.

I still doubt that a Titan V can really be used to do RT stuff. It may be able to brute-force it like the BFV devs did, but it won't be representative of anything.


If Nvidia wants developers to support ray tracing, then they should have included it across their ENTIRE product line; otherwise dev support will be minimal or non-existent, just like we had with the now-dying SLI. I also don't really get why they did this, because Nvidia likely has a patent on this stuff, so they may be able to collect royalties from game engines that use ray tracing.

 

This doesn't seem like a very good long-term move financially for Nvidia. It looks like a quick way to save a buck, but only in the short term, and it will likely hobble the development of ray tracing.

 

Finally, even after factoring in ray tracing, if the performance per dollar is not better than the previous generation's, I will pass. My argument is: what is the point of buying newly released computer hardware if it doesn't give me more performance per dollar than the previous generation? The whole POINT of Moore's law is that computers become faster for the same price, or reach the same speed for a lower price, every few years. If that doesn't end up happening, then not a single sensible person should buy ANY of the 2000-series GPUs. Why would you buy a more expensive product when you can get last generation for less?
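The performance-per-dollar comparison is simple arithmetic. A minimal sketch in Python, using made-up performance indices and prices purely for illustration (not real benchmark or pricing data):

```python
# Hypothetical figures for illustration only -- NOT real benchmarks or MSRPs.
# "perf" is a relative performance index, "price" is in USD.
gpus = {
    "last-gen": {"perf": 100, "price": 250},
    "new-gen":  {"perf": 125, "price": 350},  # ~25% faster, ~40% pricier
}

def perf_per_dollar(card):
    """Performance index earned per dollar spent."""
    return card["perf"] / card["price"]

for name, card in gpus.items():
    print(f"{name}: {perf_per_dollar(card):.3f} perf/$")

# With these made-up numbers the newer card delivers LESS performance per
# dollar (0.357 vs 0.400), which is exactly the complaint above.
```

Under these assumed numbers, the new card is faster in absolute terms but a worse value, so the "regular generational leap" the poster wants never happens.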

20 minutes ago, CUDAcores89 said:

If Nvidia wants developers to support ray tracing, then they should have included it across their ENTIRE product line; otherwise dev support will be minimal or non-existent, just like we had with the now-dying SLI. I also don't really get why they did this, because Nvidia likely has a patent on this stuff, so they may be able to collect royalties from game engines that use ray tracing.


You seem to have lived under a rock for the past few weeks. You may want to read up on what RTX is, what RT is, and who was sitting at the table when it was finalized.

No, seriously, it has been said so many times that you must have tried really hard to ignore it. Spoiler: it is not an Nvidia thing.

 

20 minutes ago, CUDAcores89 said:

This doesn't seem like a very good long-term move financially for Nvidia. It looks like a quick way to save a buck, but only in the short term, and it will likely hobble the development of ray tracing.


Again, you should read up on the explanations before coming to conclusions that you don't even explain.

You are arguing that it would be financially better for Nvidia to push their own GPUs out of the market. We will need a seriously great explanation of why that would be a good idea financially.

 

20 minutes ago, CUDAcores89 said:

Why would you buy a more expensive product when you can get last generation for less?


Because the new generation has two distinct features that the old generation does not. Don't like them? No issue. Don't buy it.

Wait for the next GPU launch without these features if you prefer that.

14 hours ago, MyName13 said:

Is this RTX ray tracing possible only on Nvidia's GPUs (because it's their technology) or would software made with ray tracing in mind work on AMD's GPUs too?

Software? Lol, software doesn't fix or contain everything. You'd get pretty low framerates without Tensor cores, especially doing it in software.

8 hours ago, Tech Enthusiast said:

Yes, you wanted that. It seems like 35-60% is not substantial for you.

Those gains aren't substantial when you factor in the price increase, and that number probably isn't even accurate.


Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List  |  How to build a gaming PC for $400US or less   |  The Real Reason Delidding Improves Temperatures

4 minutes ago, Arika S said:

fuck them for not implementing a feature that the card wouldn't even be able to keep up with?

I'm 85% sure that was a troll comment. Since #intel.


LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Galaxy S9+ - XPS 13 (9343 UHD+) - Samsung Note Tab 7.0 - Lenovo Y580

 

LINK-> Ainulindale: Music of the Ainur 

Prosumer DIY FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


I don't really care about this honestly. If you're buying a 2060, you're on a budget wanting to play 1080p (or possibly now 1440p) at 60fps or so without gsync and such. If you are an enthusiast that plays 1440p fully detailed, you get a 2070 that can perform well with RTX (presumably). If you want to play 4K near max, you go with the 2080 or 2080Ti with the best ray tracing performance possible for now in gaming.

If GTX stays as the affordable "entry" level gaming option, that's fine by me. In order for RTX to be usable, you need a lot of horsepower to drive it. A card in the $300 or less range will suck at it. If you want RTX as a gaming feature or whatever, then you know which cards will do it for you.


*Insert Name* R̶y̶z̶e̶n̶ Intel Build!  https://linustechtips.com/main/topic/748542-insert-name-r̶y̶z̶e̶n̶-intel-build/

Case: NZXT S340 Elite Matte White Motherboard: Gigabyte AORUS Z270X Gaming 5 CPU: Intel Core i7 7700K GPU: ASUS STRIX OC GTX 1080 RAM: Corsair Ballistix Sport LT 2400mhz Cooler: Enermax ETS-T40F-BK PSU: Corsair CX750M SSD: PNY CS1311 120GB HDD: Seagate Momentum 2.5" 7200RPM 500GB

 

3 hours ago, Tech Enthusiast said:

You seem to have lived under a rock for the past weeks.

I think most of this forum is living under a rock; it's the same old comments every time there is a thread about anything these days. Most people just read WCCFtech, listen to AdoredTV, and call themselves enthusiasts.

 

It doesn't seem to matter what the topic is about; it's always the same old rhetoric that gets dug up every time. Someone bangs on about AMD being hot and slow, then another moron posts a picture of Linus Torvalds sticking his finger up (I am sure they don't even know why, as that rant was 6 years ago), then you get the same old hatred for Apple, then the stupid-ass financial analysis regarding Intel and the cost of processors. It really is mind-boggling how people just react and don't bother to question why first. I mean, even if you don't know the difference between RT and DXR, or what the Tensor cores do, or even what the AI uses, it is still easy to find the basic information and see that it's fairly run-of-the-mill stuff (even exciting for those interested in RT and AI) as far as tech evolution goes.


QuicK and DirtY. Read the CoC; it's like a guide on how not to be a moron.  Also I don't have an issue with the VS series.


Personally, I am completely okay with this; it keeps the price down somewhat, which is pretty important for the xx60 cards.


 

 


but will they charge you $359.99 for the 2060?


Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


Ray tracing will become super popular, just like other Nvidia-only features such as VXAO and PhysX... Especially now that it will only be supported by a small subset of the GPU market, developers will fall over each other to implement it!

 

RTX is basically a gimmick to rationalise the high cost of their hardware, whether it is useful or not.

Hint: my money is in the 'not useful' camp.

 

What pisses me off is that apparently the 2060 is only 25% or so faster than the 1060.

What pisses me off even more is that AMD has been pissing away their gaming GPU market share with really piss-weak, barely competitive products.


"Fighting for peace is like screwing for virginity"

- George Carlin (1937-2008)

15 hours ago, Curufinwe_wins said:

But the honest-to-goodness truth is that almost everyone has a ludicrous amount of financial waste in their lives that, over the course of 2-6 months, would easily cover the difference in price between a 200 dollar GPU and a 500-800 dollar one.

I'd rather keep eating eye fillet steak than put up with lesser steak, terrible life choices :).

2 hours ago, leadeater said:

I'd rather keep eating eye fillet steak than put up with lesser steak, terrible life choices :).

How much steak do you not want to be raytraced in your life!?!

22 minutes ago, GoldenLag said:

How much steak do you not want to be raytraced in your life!?!

Dunno about RayTraced but I do MouthTrace it.


So it seems that the new RTX 2000 series will be an even bigger failure than the GTX 400 series, mainly because of unreal price points. The GTX 1080/Ti is such a better deal it's not even funny.

22 hours ago, MyName13 said:

Is this RTX ray tracing possible only on Nvidia's GPUs (because it's their technology) or would software made with ray tracing in mind work on AMD's GPUs too?

Ray tracing is a compute function, and AMD GPUs are better at compute than Nvidia's. Also, based on what AMD has told us so far, the asynchronous compute capabilities of their cards allow them to run Radeon Rays 2.0 with less performance impact.

 

So they should outperform current Nvidia parts at ray tracing, but still be slower than Turing parts with dedicated RT cores.
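To make "ray tracing is a compute function" concrete: at its core a ray tracer just solves intersection equations, which any GPU (or CPU) can run as ordinary arithmetic. A toy ray-sphere intersection test in Python (an illustrative sketch only; this is not how Radeon Rays or Nvidia's RT hardware are actually implemented):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest hit of the ray
    origin + t*direction against a sphere, or None on a miss.
    Solves |origin + t*direction - center|^2 = radius^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c          # discriminant of the quadratic
    if disc < 0:
        return None                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None       # only hits in front of the origin

# A ray fired down the z-axis hits a unit sphere centered 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A ray tracer runs tests like this (against triangles and bounding boxes, not spheres) millions of times per frame; dedicated RT cores accelerate exactly that traversal in fixed-function hardware, which is why they help even though the math itself can run anywhere.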

