So I just came from Instagram Reels, seeing people complain about Nvidia dropping support for Pascal and at least the non-ray-tracing Turing cards (10-16 series). Is it wrong to feel kinda like people are acting incredibly entitled and dumb here?

 

Like, trust me, I get being sad that a great card or family is losing support; I still have a 1060 and a 1070 being used by my family. But it's also been a long goddamn time, and while raw performance isn't horrible by any means, they just lack the tech we've been using for years and that the industry is adopting en masse (funnily enough, due to consoles).

 

Nvidia is absolutely a scummy, greedy company, but this isn't that. If they wanted, they could and would have killed drivers back in like 2020 to encourage people to move away. It's just finally come time to cut support for hardware that can't keep up (tech-wise, not performance-wise).


2 minutes ago, DJ_Jay125 said:

So I just came from Instagram Reels, seeing people complain about Nvidia dropping support for Pascal and at least the non-ray-tracing Turing cards (10-16 series). Is it wrong to feel kinda like people are acting incredibly entitled and dumb here?

 

Like, trust me, I get being sad that a great card or family is losing support; I still have a 1060 and a 1070 being used by my family. But it's also been a long goddamn time, and while raw performance isn't horrible by any means, they just lack the tech we've been using for years and that the industry is adopting en masse (funnily enough, due to consoles).

 

Nvidia is absolutely a scummy, greedy company, but this isn't that. If they wanted, they could and would have killed drivers back in like 2020 to encourage people to move away. It's just finally come time to cut support for hardware that can't keep up (tech-wise, not performance-wise).

People like to complain and find things to complain about.


19 minutes ago, DJ_Jay125 said:

So I just came from Instagram Reels, seeing people complain about Nvidia dropping support for Pascal and at least the non-ray-tracing Turing cards (10-16 series). Is it wrong to feel kinda like people are acting incredibly entitled and dumb here?

 

19 minutes ago, DJ_Jay125 said:

Nvidia is absolutely a scummy, greedy company, but this isn't that

Both can be true at once. The view that NVIDIA is especially scummy and greedy, considering the 50-series pricing and performance, just adds fuel to the fire for anything else inflammatory they could do, in this case stopping support for some older hardware.

 

I sold my GTX 1060 a while back since I found a GTX 680 to use as an emergency backup if my RTX card ever stopped working. Even though that card stopped getting support over a year ago, I swapped it in once and it was okay with games it met min spec for.

 

So dropping support for the 10/16 series doesn't upset me much, since I figure it can't be any worse than my backup GPU that's still working well after over a year of no support.


1 hour ago, DJ_Jay125 said:

they just lack the tech we’ve been using for years

"WE'VE been using..." is where your thought process was flawed.
 

A lot of new car reviewers get this wrong as well. They speak as if everyone else on earth ("WE") is exactly the same as them, bored and tired of anything they have seen before, with constant free access to new tech.

Most people don't really see much innovation in graphics between 2016 and now. To most people, a lot of games that came out 8 years ago and run fine on their 8-year-old GPU still look fine today. So it tends to be annoying when a company tells people to "join the future" and spend hundreds of dollars on a part that doesn't change anything.

I know nobody here can possibly fathom the idea that the majority of people maybe don't care about any of the advancements made in the last 8 years and are instead just sick of updating hardware to play games that are already older than their current hardware anyway. But ffs, just try and imagine it for two seconds...

The problem is their old GPUs DO work well enough, and the only thing that will prevent them from working is the company dropping support.

I'm not ever going to pat the most profitable company on earth on the back for dropping software support. They can afford to do everything. They could release drivers for these GPUs for the next 1000 years and never cut into their bottom line. They are killing off driver support to keep their older products from being functional so you buy new ones. That's the only reason.

Never assume a multi-billion-dollar company that is killing off your old hardware has your best interests in mind.


1 hour ago, DJ_Jay125 said:

So I just came from Instagram reels seeing people complain about Nvidia dropping support for Pascal and at least the non ray tracing Turing cards (10-16 series) and is it wrong to feel kinda like people are acting incredibly entitled and dumb here?

 

like trust me I get being sad a great card or family is losing support, I still have a 1060 and 1070 being used by my family but it’s also been a long goddamn time and while raw performance isn’t horrible by any means they just lack the tech we’ve been using for years and that the industry is adopting on mass (funilly enough due to consoles)

 

Nvidia is absolutely a scummy greedy company but this isn’t that, if they wanted they could and would have killed drivers back in like 2020 to encourage people to move away it’s just finally come time to cut support for hardware that can’t keep up (tech not performance wise) 

I get why they're going to be dropping support. I just think people are complaining that Pascal didn't get a buffer window. Like, for the 900 series, the 700 series got discontinued first; here they're just knocking out the 900 and 1000 series together.


2 hours ago, emosun said:

"WE'VE been using..." is where your thought process was flawed.
 

A lot of new car reviewers get this wrong as well. They speak as if everyone else on earth ("WE") is exactly the same as them, bored and tired of anything they have seen before, with constant free access to new tech.

Most people don't really see much innovation in graphics between 2016 and now. To most people, a lot of games that came out 8 years ago and run fine on their 8-year-old GPU still look fine today. So it tends to be annoying when a company tells people to "join the future" and spend hundreds of dollars on a part that doesn't change anything.

I know nobody here can possibly fathom the idea that the majority of people maybe don't care about any of the advancements made in the last 8 years and are instead just sick of updating hardware to play games that are already older than their current hardware anyway. But ffs, just try and imagine it for two seconds...

The problem is their old GPUs DO work well enough, and the only thing that will prevent them from working is the company dropping support.

I'm not ever going to pat the most profitable company on earth on the back for dropping software support. They can afford to do everything. They could release drivers for these GPUs for the next 1000 years and never cut into their bottom line. They are killing off driver support to keep their older products from being functional so you buy new ones. That's the only reason.

Never assume a multi-billion-dollar company that is killing off your old hardware has your best interests in mind.

The old GPUs do work well enough right now, and they will continue to work in everything that's out right now.

And yes, I get your point about no one caring about new tech. The issue is that Nvidia and game devs (or really publishers, but whatever) do care and use ray tracing, that's not going to stay optional in modern AAA games for very long, and in that case Pascal just literally can't work.

Also, losing driver support doesn't really ever break compatibility with things the cards already worked with; it just means newer games might not run. But again, that's kinda a given when games that require ray tracing (which is happening already) just literally don't work without heavy modding.
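
To make "just literally can't work" concrete, here is a minimal sketch, assuming C++ and the Vulkan SDK, of the capability check an RT-required game effectively performs at startup. VK_KHR_ray_tracing_pipeline is the real Khronos extension name; the rest of the program is purely illustrative.

```cpp
// Sketch: does any GPU on this machine expose the Vulkan ray tracing
// pipeline extension? Error handling trimmed; assumes the Vulkan SDK
// headers and loader (vulkan-1) are installed.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    // Enumerate every physical device the loader can see.
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        // List the device extensions this driver exposes for this GPU.
        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        bool hasRt = false;
        for (const VkExtensionProperties& e : exts)
            if (std::strcmp(e.extensionName,
                            VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
                hasRt = true;  // "VK_KHR_ray_tracing_pipeline"

        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s: RT pipeline %s\n", props.deviceName,
                    hasRt ? "available" : "NOT available");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

If the extension isn't in that list, there's no graphics setting to lower; the renderer simply has nothing to create its ray tracing pipelines with.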

 

I'm not saying Nvidia is a good guy or anything. I just think it's been damn near 10 years, technology and the industry have finally changed, and while Pascal is still fine enough today, it's not enough for tomorrow. Sometimes you do eventually have to upgrade.


1 hour ago, DJ_Jay125 said:

I just think it's been damn near 10 years, technology and the industry have finally changed

I.... don't know if you noticed this, but tech/computers have changed the LEAST they ever have in the past 10 years.

Which is why you have people defending the use of 10-year-old hardware. Normalizing the use of decade-old hardware is the result of stagnation in the industry, a new trend that was never common in decades past.

The amount of progress made these days is NOTHING compared to the difference between a 33MHz 486 and a 1.1GHz Pentium III. Nobody back then wanted to keep using old hardware, because there was no stagnation. You never had to break out the magnifying glass to see the difference.


It's absolutely silly to me to see someone think anything beyond incremental updates has been made in local hardware in recent years.

You could walk past someone playing a 10-year-old game these days and not even notice. Why would you notice? Nothing about the game would even be different enough to warrant attention.

1 hour ago, DJ_Jay125 said:

sometimes you do eventually have to upgrade

Well, let me know when you've bought a holographic wifi-enabled spoon.

If you use a metal spoon, well, you're just stuck in the past.


8 minutes ago, emosun said:

I.... don't know if you noticed this, but tech/computers have changed the LEAST they ever have in the past 10 years.

Which is why you have people defending the use of 10-year-old hardware. Normalizing the use of decade-old hardware is the result of stagnation in the industry, a new trend that was never common in decades past.

The amount of progress made these days is NOTHING compared to the difference between a 33MHz 486 and a 1.1GHz Pentium III. Nobody back then wanted to keep using old hardware, because there was no stagnation. You never had to break out the magnifying glass to see the difference.

 

I mean, yeah, I never pretended tech has moved fast, but we are in a different place in tech in 2025 than we were in 2016. Hell, remember that back then you had to be rich and on the high end to use more than 4 cores, and now even 6-core chips are considered budget.

 

Also, when the hell did I say people have to abandon Pascal? If it does everything you want it to, then please stay and wait for an even bigger performance difference; you shouldn't ever upgrade unless you're unhappy with your current stuff. But the point is, we have seen changes and improvements in tech.

12 minutes ago, emosun said:

You could walk past someone playing a 10-year-old game these days and not even notice. Why would you notice? Nothing about the game would even be different enough to warrant attention.

This is horseshit and I hope you see it.

 

Look at GTA 5 vs Cyberpunk 2077, or The Amazing Spider-Man 2 on PS4 vs Marvel's Spider-Man 2 on PS5.

Tech has slowed, you're correct, but to pretend like we haven't seen graphical upgrades is stupid.

 

I'm not trying to say anyone needs to upgrade from Pascal; if it does everything you want it to, keep using it, and the loss of new drivers won't mess with that. But if you want to play the newest games, which need the newest features, you do need to upgrade. Like it or not, tech has progressed. Nowhere near as much as in the 90s, you're right, but it also hasn't been anywhere near as stagnant as you're pretending it is.


2 minutes ago, DJ_Jay125 said:

Hell, remember that back then you had to be rich and on the high end to use more than 4 cores, and now even 6-core chips are considered budget.

My machine at the time was 12 cores, and it was 7 years old. You might need to do some brushing up on history.

5 minutes ago, DJ_Jay125 said:

it also hasn't been anywhere near as stagnant as you're pretending it is.

Name a time when it was more stagnant in the past 30 years.

I'll even give ya a little help: 1980-1989, most people used 1MHz systems with little to no change, seeing almost zero major advancements worth upgrading to, due to lack of interest in home computers in general.

Now you pick up the reins and let me know when PC hardware was more stagnant after that.

 

10 minutes ago, DJ_Jay125 said:

Look at GTA 5 vs Cyberpunk 2077, or The Amazing Spider-Man 2 on PS4 vs Marvel's Spider-Man 2 on PS5.

Super Mario World and Halo are one decade apart. Is GTA 5 2D? I forgot. Maybe it is; I haven't played in a while.


31 minutes ago, emosun said:

My machine at the time was 12 cores, and it was 7 years old. You might need to do some brushing up on history.

Name a time when it was more stagnant in the past 30 years.

I'll even give ya a little help: 1980-1989, most people used 1MHz systems with little to no change, seeing almost zero major advancements worth upgrading to, due to lack of interest in home computers in general.

Now you pick up the reins and let me know when PC hardware was more stagnant after that.

 

Super Mario World and Halo are one decade apart. Is GTA 5 2D? I forgot. Maybe it is; I haven't played in a while.

You're honestly making garbage arguments back to back to back.


This isn't the most stagnant time in computing, as there is still slow progress. I'm not saying people should buy a 5080 when they have a 4080 (or honestly a 3080 too, lol), but since Pascal first came out in 2016 and 2017 there have been changes in power. More importantly, the reason Nvidia is cutting support (besides greed; Nvidia is greedy, I never said they weren't) is that ray tracing is starting to become something the industry wants as standard. Even if users don't care, newer games are requiring it, and because of that it makes sense to cut support for the 10 and 16 series. I didn't see you get this upset when AMD killed Polaris years back.

 

Also, just because GTA 5 isn't 2D doesn't mean it doesn't look less polished than newer games. We are still getting visual improvements in gaming.

 

I honestly don't know if you get my point here at all: the industry is slowing down and stagnating, and Nvidia is greedy as hell, but at the same time, if you want the newest stuff you need newer toys.

Or, I mean, should Nvidia still be supporting the 980? Or the 280? Or the 8800? Or the Riva TNT2? Tech is moving slowly, but it's still moving.

 

If you don't give a shit and really think the industry is stagnant, and your 1060 or 1080 Ti is still good enough for you, more power to you! I've never been here saying people need to upgrade. I still have 1060s and 1070s in systems today, because for my little sisters they're plenty.
 

It's just that support isn't going to last forever; new drivers aren't going to last forever. It's been 9 years since the 1070 and 1080 came out, and they've had a good run; they've gotten support any other card would dream about. But as devs use, and sometimes mandate, ray tracing or DLSS or whatever, older cards are going to come up short, and yeah, Nvidia doesn't want to try to force another few years out of cards not comparable with anything they're working on today, so they cut driver support.


I think it's reasonable to expect at least 10 years of support for GPUs. The 1000 series released in summer 2016, so we're not quite there yet.

 

This seems like a push to make these older cards insufficient for modern games by stopping optimization.

 

That being said, the people who do play old games and still have these GPUs will still be able to play them. Nothing changes. The GPUs will also still work in brand-new games; they just won't receive specific optimizations anymore. To be fair, the kind of people who constantly play the newest AAA games aren't the ones still holding on to their 1060, because the performance isn't good enough anyway.

 

I have a buddy with a 1070 and he really has to fight with graphics settings to achieve anything close to 1080p 60 fps in newer games, which is imo the minimum requirement for a good experience.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


26 minutes ago, DJ_Jay125 said:

You’re making honestly garbage arguments back to back to back 

 

Pretty glad you typed that, followed by:

 

25 minutes ago, DJ_Jay125 said:

the reason Nvidia is cutting support ... is that ray tracing is starting to become something the industry wants as standard

Let's hear the long list of games that require ray tracing.

I mean, the list must be so long that it exceeds the ability to even type them all out. God, there must be thousands of games that require it.

 

29 minutes ago, DJ_Jay125 said:

or I mean should Nvidia still be supporting the 980? Or the 280? Or the 8800? Or the riva TNT2?

Yes.

 

See how you typed something that would only yield a good result? Software support is a GOOD thing. The ONLY people who want software support to stop are the assholes who profit from it stopping.

And if you're THAT far gone, drinking whatever koolaid a multi-billion-dollar company forces down your throat, to the point of thinking they can't afford basic compatibility updates, then there's no convincing you of anything.

Nvidia employee of the month. Maybe you'll get a "good job" sticker, and BMW will have you defend their heated-seat subscriptions next.


16 minutes ago, DJ_Jay125 said:

 

or I mean should Nvidia still be supporting the 980? Or the 280? Or the 8800? Or the riva TNT2? Tech is moving slowly but it’s still moving

 

Funny you should mention that, because the change from GTX to RTX parts to add ray tracing is the same as the change from the Riva TNT series to the GeForce series, which added hardware T&L.

https://developer.download.nvidia.com/assets/gamedev/docs/TransformAndLighting.pdf


Now look at ray tracing: https://www.nvidia.com/en-us/geforce/news/gfecnt/geforce-gtx-dxr-ray-tracing-available-now/


 

Accelerating The Real-Time Ray Tracing Ecosystem: DXR For GeForce RTX and GeForce GTX

So what's changed here is that the processing moved from one logic block to another logic block on the GPU.

 

If Nvidia is dropping support for Pascal, that's likely because they've changed priorities to doing RT only on the RT logic, which Pascal and 16xx Turing don't have.
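
To make the "one logic block to another" point concrete, here is a minimal sketch, assuming C++ and the Windows SDK, of how an engine asks whether DXR is available at all. D3D12CreateDevice, D3D12_FEATURE_D3D12_OPTIONS5, and D3D12_RAYTRACING_TIER_1_0 are the real D3D12 names; the surrounding program is illustrative. The tier the driver reports looks the same whether the rays run on dedicated RT cores or on shader cores through the fallback driver linked above.

```cpp
// Sketch: querying the raytracing tier D3D12 reports for the default adapter.
// On Pascal with Nvidia's DXR driver the work runs on shader cores; on RTX
// parts it runs on dedicated RT logic. The API-level answer looks the same.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter (feature level 12.0 minimum).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }

    // OPTIONS5 carries the raytracing tier the driver exposes.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            std::puts("DXR available (RT cores or compute-shader fallback).");
        else
            std::puts("DXR not supported on this adapter/driver.");
    }
    return 0;
}
```

Games gate on this tier enum rather than on GPU model names, which is why an RT-required title simply refuses to start on a card whose driver reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED.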

 

What will ultimately slow down the adoption of ray tracing is AMD and Intel not supporting the same hardware feature set. Look at how CUDA has evolved: Intel and AMD haven't adopted CUDA; they instead built their own instructions and then threw them away multiple times, ceding the entire GPGPU leadership to Nvidia.

 

If someone is mad about Pascal, LET ME TELL YOU: did you know Intel discontinued ALL pre-Xe graphics drivers? That's all 10th gen and earlier; sorry if you bought a PC with no dGPU in 2020.


And then there is AMD:

https://www.anandtech.com/show/21126/amd-reduces-ongoing-driver-support-for-polaris-and-vega-gpus

So the Radeon 600 series and the iGPUs in Ryzen 2000G-5000G? Bye-bye. 2017 Vega cards? Bye.

 

So AMD already discontinued their "Pascal" generation.

 


20 minutes ago, Kisai said:

If someone is mad about Pascal, LET ME TELL YOU: did you know Intel discontinued ALL pre-Xe graphics drivers? That's all 10th gen and earlier; sorry if you bought a PC with no dGPU in 2020.

And then there is AMD:

https://www.anandtech.com/show/21126/amd-reduces-ongoing-driver-support-for-polaris-and-vega-gpus

So the Radeon 600 series and the iGPUs in Ryzen 2000G-5000G? Bye-bye. 2017 Vega cards? Bye.

 



1 hour ago, emosun said:
1 hour ago, DJ_Jay125 said:

Hell, remember that back then you had to be rich and on the high end to use more than 4 cores, and now even 6-core chips are considered budget.

My machine at the time was 12 cores, and it was 7 years old. You might need to do some brushing up on history.

Yes, and mine wasn't. Good argument, you see?

How does this counter his argument that 4 cores were kinda the high-end for consumers in 2016? Because all you are saying is that you had a baller system in 2016 (or 2018?), while most people were on Intel (which was really stagnating leading up to that point), which was only pushing 4-core/8-thread processors unless you went HEDT workstation. Ryzen launched in 2017; before that, there was no need for Intel to offer more than 4 cores in anything not high-end, and thus no impetus for devs/publishers to support or utilize those cores.

 

Pascal launched 8 years ago, which isn't too long in the grand scheme of things, and I wish Nvidia would continue support, or at least provide enough resources to the open-source community to allow them to provide important fixes/security updates.

 

However, it is obsolete hardware. Not because of performance or because it suddenly stopped working (and it will continue to work for the foreseeable future on current/old games), but because it lacks a feature the industry is pushing hard for and wants to establish as a new standard. This is pretty common practice and nothing particularly outrageous imho.


17 minutes ago, GarlicDeliverySystem said:

How does this counter his argument that 4 cores were kinda the high-end for consumers in 2016?

Because they had been making quad cores for over a decade by then, and core count has zero relation to speed.

Am I the oldest person here? Do you guys think 2016 was in sepia tone and we all rode horses or something? Jeez, what's the point of even trying; it just goes in one ear and out the other.

17 minutes ago, GarlicDeliverySystem said:

Because all you are saying is that you had a baller system in 2016 (or 2018?), while most people were on Intel (which was really stagnating leading up to that point), which was only pushing 4-core/8-thread processors

It was 7 years old and was the cheapest, slowest used Supermicro board available.

That's what I was saying: one of the cheapest Xeon setups you could get at the time, which wasn't any faster than Intel's midrange lineup, was 12 cores.

 

And I feel like I need to say this for anyone who may not be old enough to drive yet: core count had nothing to do with speed.

Having a 12-core Xeon didn't mean you had a fast computer; it meant multicore PCs had existed for so long that they were cheaper than Intel's midrange offerings at the time.
 

17 minutes ago, GarlicDeliverySystem said:

Before that, there was no need for Intel to offer more than 4 cores in anything not high-end, and thus no impetus for devs/publishers to support or utilize those cores.

 

A 20-year-old argument, but go ahead and play the classic hits; I love hearing them.
 

17 minutes ago, GarlicDeliverySystem said:

but because it lacks a feature the industry is pushing hard for and wants to establish as a new standard. This is pretty common practice and nothing particularly outrageous imho.

And why support the users when we could support the industry, right? I'll just copy-paste for you what I said earlier:

55 minutes ago, emosun said:

Nvidia employee of the month

 


55 minutes ago, Kisai said:

Funny you should mention that, because the change from GTX to RTX parts to add ray tracing is the same as the change from the Riva TNT series to the GeForce series, which added hardware T&L.

https://developer.download.nvidia.com/assets/gamedev/docs/TransformAndLighting.pdf


Now look at ray tracing: https://www.nvidia.com/en-us/geforce/news/gfecnt/geforce-gtx-dxr-ray-tracing-available-now/


 

Accelerating The Real-Time Ray Tracing Ecosystem: DXR For GeForce RTX and GeForce GTX

So what's changed here is that the processing moved from one logic block to another logic block on the GPU.

 

If Nvidia is dropping support for Pascal, that's likely because they've changed priorities to doing RT only on the RT logic, which Pascal and 16xx Turing don't have.

 

What will ultimately slow down the adoption of ray tracing is AMD and Intel not supporting the same hardware feature set. Look at how CUDA has evolved: Intel and AMD haven't adopted CUDA; they instead built their own instructions and then threw them away multiple times, ceding the entire GPGPU leadership to Nvidia.

 

If someone is mad about Pascal, LET ME TELL YOU: did you know Intel discontinued ALL pre-Xe graphics drivers? That's all 10th gen and earlier; sorry if you bought a PC with no dGPU in 2020.


And then there is AMD:

https://www.anandtech.com/show/21126/amd-reduces-ongoing-driver-support-for-polaris-and-vega-gpus

So the Radeon 600 series and the iGPUs in Ryzen 2000G-5000G? Bye-bye. 2017 Vega cards? Bye.

 

So AMD already discontinued their "Pascal" generation.

 

Super glad I randomly remembered the Riva TNT, because you just made the argument I've been trying to make, in far more depth than I could.

 

Like, I wish support could continue forever, and yes, Pascal has aged incredibly well. But with the focus on RT and AI (especially now that Xbox and PS5 support RT, and by the end of the year PS5 and probably Switch 2 will support AI upscaling), the industry is moving whether we all like it or not, and with this movement, supporting Pascal or non-RT Turing just doesn't make sense.


15 minutes ago, DJ_Jay125 said:

the industry is moving whether we all like it or not, and with this movement, supporting Pascal or non-RT Turing just doesn't make sense

.....for the richest company on earth.

Look, if you're here to shill Nvidia, just come out and say it; we all gotta make a buck somehow, at least be honest.
 

15 minutes ago, DJ_Jay125 said:

but with the focus on RT and AI

And NFTs and crypto and (insert buzzword for people who fall for pyramid schemes here).


2 hours ago, emosun said:
2 hours ago, GarlicDeliverySystem said:

How does this counter his argument that 4 cores were kinda the high-end for consumers in 2016?

Because they had been making quad cores for over a decade by then, and core count has zero relation to speed.

Am I the oldest person here? Do you guys think 2016 was in sepia tone and we all rode horses or something? Jeez, what's the point of even trying; it just goes in one ear and out the other.

Xeons were never really consumer chips; by that logic we should compare it to the 64- or 128-core CPUs in workstations and servers. Though it gets murky when you factor Knights Corner and Knights Landing into this.

 

Anyway, from the last sentence in that quote I get the impression we are not dealing in good faith here, so I'll just skip the rest. Have a good day.


15 minutes ago, GarlicDeliverySystem said:

Xeons were never really consumer chips

 

2 hours ago, emosun said:

It was 7 years old and was the cheapest, slowest used Supermicro board available.

 

Yep, so looks like:

 

2 hours ago, emosun said:

it just goes in one ear and out the other

 


22 minutes ago, McCarthy said:

 

Other way around. AMD's quarterly share of GPUs sold dropped into the single digits, Intel is back to 0%, and NVIDIA is now at over 90%.

 

No, I'm exactly correct: the PS5 (RDNA 2) and Xbox Series X (RDNA 2) have hardware ray tracing. It isn't good RT, but do you think AMD would have won the contract for the PS5 if it didn't? You cannot sell a GPU that does not do RT now, because it would put it behind the capabilities of the game consoles, too far behind the curve of "where the puck is going." If Xbox and PlayStation are going in this direction, and Nintendo is just not moving, you just ignore the Switch. You are not going to add an entire extra "garbage graphics mode" just for it. Nope, non-RT mode is just cut and gone.

 

Intel cannot win the PS6 or Xbox Series W (whatever) if there is no RT, because it wouldn't be able to play games for the PS5/Xbox Series S.


3 hours ago, McCarthy said:

AMD shot themselves in the foot with their lackluster RT dev.

Still thinking AMD is actually competing? 😅

 

Besides, RT (effects) seems to work just fine on AMD; Nvidia has a slight edge, yes, like in almost everything else too.

 

5 hours ago, McCarthy said:

the stats don't lie

Not a single Intel card? 😭

Sus...


6 hours ago, McCarthy said:

 

Your comments are now contradicting each other.

They are not. My point was that there is a reason why AMD and Intel discontinued their non-RT hardware before Nvidia: they were late to that game, and they know they're not going to sell new hardware without RT features, be that to customers on the PC or to console manufacturers.

 

6 hours ago, McCarthy said:

Also, this thread is about Pascal GPUs, not consoles.

Doesn't matter; AMD sells more GPUs in consoles, and Intel is left out in the cold. Quite frankly, it's amazing that Nvidia even bid on Nintendo's.

 

 

6 hours ago, McCarthy said:

AMD shot themselves in the foot with their lackluster RT dev.

I'm not saying they didn't. What I said was "feature set."

 

Going "I got RT too" and then only implementing 1 of 69420 features is what I refer to as a "checklist/checkbox" feature set. Saying you have the feature and implementing it in a way that matters are completely different things.

 

It's like seeing a film and going "yep, there's two women who talk to each other, Bechdel test satisfied" while ignoring the fact that those characters get killed off-screen 3 minutes later and now all the speaking characters are 6'9" white men. Just doing the bare minimum to not get torn into by reviewers does not make a good product.

 

 
