Nvidia might delay RTX 40 series on account of a 30 series market flood

20 minutes ago, Kisai said:

Cause anything below the "x60" tier on GeForce is a crap value: low performance, utterly setting your money on fire.

The 960 and the 1060 3GB were both $199 parts, and even something like the 1050 Ti provided really great performance in most games for most people.

 

Nowadays an x60 part costs more than an x70 used to cost (and inflation doesn't justify it; the pricing for the 2000 and 3000 series was way above inflation during that period).

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


On 7/13/2022 at 7:01 AM, Arika S said:

People are generalised on this forum all the time; why is it only "gamers" who can't be generalised?

 

Besides, pretty sure it was gamers calling for everyone (manufacturers, lawmakers, governments) to put things in place so miners couldn't buy GPUs.

 

Disclaimer (since apparently you need it whenever you're not anti-miner): I don't mine, I'm not a miner, I hate cryptocurrency in all its forms.

I think most gamers just wanted miners to stop buying a bunch of GPUs. I mean, if you buy one I don't think anyone would care, but if you are buying like 8 then it becomes an issue for supply; miners likely wouldn't put that much of a dent in it if all of them were limited to one. Honestly, the biggest issue is probably the mining farms for the most part, as I am pretty sure Nvidia was selling to them directly and screwing up supplies for everyone else.


10 minutes ago, Brooksie359 said:

Honestly, the biggest issue is probably the mining farms for the most part

I doubt that, especially since those were mostly common in Asia.

10 minutes ago, Brooksie359 said:

as I am pretty sure Nvidia was selling to them directly and screwing up supplies for everyone else.

I also don't think that's the case, since availability was perfectly fine here, with all of the products always available from day 0; the only problem was the high markup from retailers, since MSRP isn't mandatory.

 

And even in places where supply was an issue, it also was for the PS5 and the newest Xbox; people forget that we got thrown into a pandemic where most people were stuck at home and decided to upgrade their work/hobby setups.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


28 minutes ago, Kisai said:

Cause anything below the "x60" tier on GeForce is a crap value: low performance, utterly setting your money on fire. The mainline part is the x70 if you have a 1080p60+ but below 4Kp60 performance target. The x60 part just barely cracks 1080p60 performance at present, and x50 (Ti) parts are pretty much 1080p non-gaming parts at this point.

You can't just go by tier. Generation makes a massive difference, so if you mean a specific part, name that specific part.

 

Also, I feel your performance demands are far higher than what I find in practice. As a user of a 3070 4K system, it is overkill for 1080p unless you're an insane-fps type, and borderline 4K60 Ultra class. That depends a lot on the game: some make it, some fall slightly short, but with G-Sync it doesn't matter. I also still use a 2070 (roughly a 3060) on a 1440p system and it has no trouble driving that with high+ settings. A 3050 (equivalent to my old 1070) would be plenty for most people at 1080p.

 

My laptop has a mobile 3070 with a 1080p display, and I basically never have to worry about dropping below 60 fps. I even tried throwing (DL)DSR at it to use up more of its power, but it is basically limited by the low-resolution display and not worth it to me.

 

28 minutes ago, Kisai said:

Note how the 30 series 60Ti and 70 are the only 30 series parts with a "good value"

Looking at perf/$ for a GPU by itself is basically meaningless for a gamer. You can't game on the GPU by itself. It is better, but still not perfect, to do perf/$ for the whole system. Realistically there are two key requirements one tries to satisfy: good enough performance for the chosen display, and staying within budget. One or the other may dominate.
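To make that concrete, here's a minimal sketch with made-up numbers (the prices, frame rates and the $700 "rest of system" cost are purely illustrative, not real benchmarks): a card that wins on GPU-only perf/$ can lose once the rest of the build is counted.

```python
# Hypothetical numbers purely for illustration, not real benchmarks or prices.
gpu_a = {"price": 300, "fps": 100}
gpu_b = {"price": 450, "fps": 130}
rest_of_system = 700  # assumed cost of CPU, board, RAM, PSU, case, storage

for name, gpu in (("A", gpu_a), ("B", gpu_b)):
    gpu_only = gpu["fps"] / gpu["price"]
    whole_system = gpu["fps"] / (gpu["price"] + rest_of_system)
    print(f"GPU {name}: {gpu_only:.3f} fps/$ alone, {whole_system:.3f} fps/$ as a system")

# GPU A "wins" on GPU-only perf/$ (0.333 vs 0.289),
# but GPU B wins once the rest of the system is included (0.113 vs 0.100).
```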

 

Perf/$ of the GPU alone can start to make sense for compute use cases where you run multiple GPUs with minimal supporting hardware around them. That could be folding, mining, or whatever else.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


21 minutes ago, igormp said:

I doubt that, especially since those were mostly common in Asia.

I also don't think that's the case, since availability was perfectly fine here, with all of the products always available from day 0; the only problem was the high markup from retailers, since MSRP isn't mandatory.

 

And even in places where supply was an issue, it also was for the PS5 and the newest Xbox; people forget that we got thrown into a pandemic where most people were stuck at home and decided to upgrade their work/hobby setups.

Idk where you live, but where I was, the only option to get the cards was from scalpers. Maybe you could get lucky and get one right as they restocked, but it was super hit or miss. AMD cards were pretty much always available even if they were a bit more than their MSRP.


On 7/13/2022 at 7:01 AM, Arika S said:

People are generalised on this forum all the time; why is it only "gamers" who can't be generalised?

 

Besides, pretty sure it was gamers calling for everyone (manufacturers, lawmakers, governments) to put things in place so miners couldn't buy GPUs.

 

Disclaimer (since apparently you need it whenever you're not anti-miner): I don't mine, I'm not a miner, I hate cryptocurrency in all its forms.

I think the problem with generalizing gamers in this case is that a lot of gamers were just tired of miners buying up graphics cards in bulk, and yet the blame goes to gamers for wanting to buy a single card, so I can see why some might get upset. The cryptomining hype is a main reason why GPUs have gotten ridiculously expensive, with no budget price-to-performance option in the $200-250 range; there's no point in offering any lower-tier products when miners buy up every GPU. I find it a bit ridiculous for anyone to ask lawmakers to enforce who can buy a GPU; governments and lawmakers don't care, and Nvidia only got a $5.5 million fine for lying about the mining demand back in 2017. Also, I didn't say anyone here was a miner.

24 minutes ago, igormp said:

I doubt that, especially since those were mostly common in Asia.

Nvidia selling to mining farms would be even more profitable, as the GPUs are made in Taiwan and China; there's no need to ship them anywhere else if most of the supply is going to mining farms.

28 minutes ago, igormp said:

I also don't think that's the case, since availability was perfectly fine here, with all of the products always available from day 0; the only problem was the high markup from retailers, since MSRP isn't mandatory.

 

And even in places where supply was an issue, it also was for the PS5 and the newest Xbox; people forget that we got thrown into a pandemic where most people were stuck at home and decided to upgrade their work/hobby setups.

If you live somewhere without much demand for GPUs, or where prices are always high, there wouldn't be much demand from scalpers or miners. During the peak of the mining hype you couldn't buy a card anywhere except from a scalper, or if you had a Microcenter nearby and waited in line.

I can understand supply for consoles being an issue but I doubt people were rushing out to buy an RTX 3080 or 3090 for their work system.

 


6 hours ago, igormp said:

The 960 and the 1060 3GB were both $199 parts, and even something like the 1050 Ti provided really great performance in most games for most people.

 

Nowadays an x60 part costs more than an x70 used to cost (and inflation doesn't justify it; the pricing for the 2000 and 3000 series was way above inflation during that period).

 

Just pointing out the obvious: the problem here is that people assume Nvidia is pricing by tier rather than by performance. Look at any Nvidia part since, oh, the GTX 700 series, and the replacement part in the same "high end" tier would almost double, while the part at the x50/x60 tier would stay the same.

 

[Image: relative performance chart for the x50-tier cards (750 / 950 / 1050 / 2050 / 3050)]

The spread between the 3050 and the 2050 is double. The spread among the 2050/1050/950 is within 5%, and then the 750 is 37% below the 950. Yes, I'm aware that the 2050 part is listed as mobile. That's a 73.5% change in performance if you went straight from the 750 to the 3050.

 

[Image: relative performance chart for the x60-tier cards (760 / 960 / 1060 / 2060 / 3060)]

Now here, the difference between the 3060 and 2060 is 17%. There is 28% between the 2060 and 1060, 40% between the 1060 and 960, and 21% between the 960 and 760. That's a 71.9% change in performance if you went straight from a 760 to a 3060.

 

The worst gain is between the 2060 and 3060.

 

[Image: relative performance chart for the x70-tier cards (770 / 970 / 1070 / 2070 / 3070)]

 

Again, a 27.5% spread between the 3070 and 2070, and 73.5% between the 3070 and the 770, the same spread as the x50 parts. So if you were in fact upgrading from a 770 to a 3070, the performance gain would be the same as going from the 750 to the 3050. The 2070 from the 1070: 16%; the 1070 from the 970: 28.3%; the 970 from the 770: 39.1%.

 

Yet if you upgraded to EVERY card in between, the worst gain would have been the 20 series.

 

[Image: relative performance chart for the x80-tier cards (780 / 980 / 1080 / 2080 / 3080)]

Well, look at that: the worst value for upgrading after the x60s, at 68%. The 3080 from the 2080: 25.3%; the 2080 from the 1080: 18.3%; the 1080 from the 980: 26.5%; the 980 from the 780: 28.5%.

 

And we can't look at the 90 parts, because the 3090 can only be compared to the Titan models, which were not really marketed as gaming cards.

 

So as a linear upgrade, every upgrade in the x80 tier offered about 25% over the previous one, except upgrading to the 20 series at 18.3%, whereas the x70 parts ranged from 16% to 40%. By all accounts, no matter which part you picked, the worst value upgrade was from the 10xx parts to the 20xx parts.

 

Yet when should you really upgrade? That 30 series upgrade more than tripled the performance, but only if you were replacing an obsolete GTX 7xx series part. Note that the 7xx (GM10x) and 9xx (GM20x) parts are both Maxwell parts. It would have only been a straight doubling of the performance if you went straight from the 9xx part to the 30xx part.
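As a rough illustration of how those per-generation steps stack up (using the x80 figures quoted above, and treating them as multiplicative gains, which is an assumption on my part):

```python
# x80-tier gains quoted above: 980 -> 1080 +26.5%, 1080 -> 2080 +18.3%, 2080 -> 3080 +25.3%.
# Treating them as multiplicative per-generation gains is an assumption for illustration.
gains = [0.265, 0.183, 0.253]

cumulative = 1.0
for g in gains:
    cumulative *= 1 + g

print(f"980 -> 3080: roughly {cumulative:.2f}x the performance")  # ~1.88x
# Three modest 18-27% steps compound to nearly a doubling, which is why
# skipping generations looks far better than any single-generation upgrade.
```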

 

 

6 hours ago, porina said:

You can't just go by tier. Generation makes a massive difference, so if you mean a specific part, name that specific part.

 

Also, I feel your performance demands are far higher than what I find in practice. As a user of a 3070 4K system, it is overkill for 1080p unless you're an insane-fps type, and borderline 4K60 Ultra class.

You absolutely can, because that's what the price/performance tier is marketed as. Going from 720p to 1080p, and from 1080p to 4K, requires a large increase in performance and video memory each time (4x the pixels in the 1080p-to-4K step). However, there are half-steps like 1440p, which is why a part like the x70 might fit better than an x80.
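For reference, the raw pixel counts behind that scaling look like this (plain resolution arithmetic, not benchmark data):

```python
# Pixels per frame at common resolutions; raw arithmetic, not benchmarks.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1080p"] / pixels["720p"])   # 2.25x the pixels of 720p
print(pixels["1440p"] / pixels["1080p"])  # ~1.78x the pixels of 1080p
print(pixels["4K"] / pixels["1080p"])     # 4.0x the pixels of 1080p
# 4K is a full 4x pixel jump over 1080p, while 1440p is the half-step
# in between, which is why an x70 can fit 1440p where an x80 targets 4K.
```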

 

Games aren't static; the performance demands track the same way. If you're buying a new game using the UE5 engine, or even a game that started being developed 10 years ago on UE4, you're going to end up needing a 3080 to run at Ultra in 4K. Even games like Fortnite put this level of demand on the GPU.

 

Historically, Nvidia has always made it look like "the new part is equal to the previous tier up." So let's see if this is true.

[Image: cross-generation performance comparison chart]

Checks out (turns out the product name gap between Maxwell and Pascal is actually wider)

[Image: cross-generation performance comparison chart]

Also checks out.

 

So theoretically the price should be linear through the tiers, right? Is the MSRP of a 3070 ($499) equal to the MSRP of the 2080 ($699)? No. Are the 3060 ($329), 2070 ($499), and 1080 ($599) equal? No.

 

 


2 hours ago, Brooksie359 said:

Idk where you live, but where I was, the only option to get the cards was from scalpers. Maybe you could get lucky and get one right as they restocked, but it was super hit or miss. AMD cards were pretty much always available even if they were a bit more than their MSRP.

Then I'd say that scalpers were the real problem, not miners; otherwise, miners would've bought from those scalpers too at the same time and no one would even see a GPU for sale at all.

2 hours ago, Blademaster91 said:

Nvidia selling to mining farms would be even more profitable, as the GPUs are made in Taiwan and China; there's no need to ship them anywhere else if most of the supply is going to mining farms.

If that were the case, the rest of the world wouldn't see any GPUs at all either.

 

2 hours ago, Brooksie359 said:

Idk where you live, but where I was, the only option to get the cards was from scalpers. Maybe you could get lucky and get one right as they restocked, but it was super hit or miss. AMD cards were pretty much always available even if they were a bit more than their MSRP.

AMD GPUs actually had worse availability here, but that's mostly because AMD doesn't have as good a reputation as Nvidia locally. Scalping just wasn't as bad of a problem here, but we still had plenty of miners due to the cheap electricity at the time (and it still didn't mess with stock availability).

 

2 hours ago, Blademaster91 said:

If you live somewhere without much demand for GPUs, or where prices are always high, there wouldn't be much demand from scalpers or miners. During the peak of the mining hype you couldn't buy a card anywhere except from a scalper, or if you had a Microcenter nearby and waited in line.

I can understand supply for consoles being an issue but I doubt people were rushing out to buy an RTX 3080 or 3090 for their work system.

There is demand, but yes, the price was pretty high to begin with. Companies started selling close to MSRP at launch, then kept increasing the prices until cards got twice as expensive in less than a couple of months, so there was no reason for scalpers to try to make a profit lol

Do you really believe all of those people in line at Microcenter were miners?

Also, many companies gave their WFH employees money to upgrade their workstations. Mine did, and I got some new stuff.

 

The number of GPUs sold also wasn't really that high to begin with (earlier years sold more), and right at the start of the pandemic, before the mining boom, we already had signs of a demand peak:

[Image: chart of GPU unit shipments over time]

Source

 

Don't get me wrong, I'm against cryptomining and find it dumb as hell, but blaming mining for a supply shortage (which happened in MANY areas, including automotive) is just being an entitled gamer who wants to point the finger at some bad guy.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


5 hours ago, igormp said:

Then I'd say that scalpers were the real problem, not miners; otherwise, miners would've bought from those scalpers too at the same time and no one would even see a GPU for sale at all.

If that were the case, the rest of the world wouldn't see any GPUs at all either.

 

AMD GPUs actually had worse availability here, but that's mostly because AMD doesn't have as good a reputation as Nvidia locally. Scalping just wasn't as bad of a problem here, but we still had plenty of miners due to the cheap electricity at the time (and it still didn't mess with stock availability).

 

There is demand, but yes, the price was pretty high to begin with. Companies started selling close to MSRP at launch, then kept increasing the prices until cards got twice as expensive in less than a couple of months, so there was no reason for scalpers to try to make a profit lol

Do you really believe all of those people in line at Microcenter were miners?

Also, many companies gave their WFH employees money to upgrade their workstations. Mine did, and I got some new stuff.

 

The number of GPUs sold also wasn't really that high to begin with (earlier years sold more), and right at the start of the pandemic, before the mining boom, we already had signs of a demand peak:

[Image: chart of GPU unit shipments over time]

Source

 

Don't get me wrong, I'm against cryptomining and find it dumb as hell, but blaming mining for a supply shortage (which happened in MANY areas, including automotive) is just being an entitled gamer who wants to point the finger at some bad guy.

You are joking, right? We are currently seeing a huge oversupply of GPUs as a direct result of cryptocurrency crashing. This has even caused Nvidia to want to reduce its fab allocation from TSMC by a significant amount for their next-gen chips, as a direct result of the cryptocurrency crash. Nvidia is even getting sued for not disclosing just how significant a part cryptocurrency mining played in their record sales. I'm sorry, but you don't get sued for not disclosing how many GPUs were sold to miners unless it's a large amount. If you think, after looking at all of those facts, that cryptocurrency wasn't one of the biggest causes of the shortage, then I don't know what to tell you; as soon as cryptocurrency became unprofitable we no longer had a supply shortage, imagine that.


19 hours ago, MageTank said:

That said, let's humor your logic here. If I, as a gamer, am going to call into question the moral implications of mining's impact on the environment, is my lesser impact automatically free from the same scrutiny because the scale is smaller? Am I not to be seen as hypocritical by supplementing my disdain towards miners with a moral stance, when I myself am taking no additional measures to appease the moral concern I am referencing?

The entire argument of "gaming causes less damage so we shouldn't hate it" also hinges on the idea that gaming uses less power, which we don't know for sure. 

 

Sure, 1 GPU used for gaming for let's say 5 hours a day might not use as much power as 1 GPU used for mining 24 hours a day, but what if there are 5 times more gamers than miners? All of a sudden it's 5 GPUs running for 5 hours a day each for a total of 25 hours a day, VS one mining GPU running for 24 hours. 

 

The anti-mining gamer crowd likes to think that something like 80% of GPUs sold went to miners, but all the facts and professional estimates we have so far indicate that it's more like 10% of GPUs being sold for mining.

 

What is most likely to use the most power, 1 GPU used for mining or 9 GPUs used for gaming? Assuming those numbers are accurate of course. 
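To put rough numbers on the 5-gamers-versus-1-miner example above (the wattage and hours below are assumptions made up for the sake of the comparison):

```python
# Illustrative assumptions only: identical 200 W draw for both uses,
# 5 gaming GPUs at 5 hours/day vs 1 mining GPU running 24 hours/day.
gpu_power_w = 200

gaming_gpus, gaming_hours_per_day = 5, 5
mining_gpus, mining_hours_per_day = 1, 24

gaming_kwh_per_day = gaming_gpus * gaming_hours_per_day * gpu_power_w / 1000
mining_kwh_per_day = mining_gpus * mining_hours_per_day * gpu_power_w / 1000

print(gaming_kwh_per_day, mining_kwh_per_day)  # 5.0 kWh vs 4.8 kWh per day
# With these made-up ratios the aggregate totals land in the same ballpark,
# which is the point: total usage depends on how many GPUs are doing what,
# not only on hours per GPU.
```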


4 hours ago, Brooksie359 said:

You are joking, right? We are currently seeing a huge oversupply of GPUs as a direct result of cryptocurrency crashing.

I still dare to say it's mostly regular people looking to upgrade for the next gen; the same thing ALWAYS happens when a new gen release is close.

 

4 hours ago, Brooksie359 said:

as soon as cryptocurrency became unprofitable we no longer had a supply shortage, imagine that.

Supply was coming back to normal even before the actual crash. As soon as rumors of the new models started to come out, prices started to drop and people started to sell their current GPUs, unlike the 2018 crash, when mining crypto actually became unprofitable (it's still profitable now, fwiw).

 

4 hours ago, Brooksie359 said:

If you think, after looking at all of those facts, that cryptocurrency wasn't one of the biggest causes of the shortage, then I don't know what to tell you; as soon as cryptocurrency became unprofitable we no longer had a supply shortage, imagine that.

Do you even know what you're talking about? That fine was for what happened in 2018, and mostly related to the sales in China.

 

Even in the graph I posted before, you can clearly see that the number of GPUs sold wasn't even close to 2017~2018 levels, even though we had a freaking pandemic with people stuck at home, some with a stimulus check, who wanted to upgrade their play/workstations.

 

Again, most of the scenario you're making up in your mind is what used to happen in China, with big mining farms and tons of GPUs. In other places what you usually see is a handful of people who fill a single room with 10~20 GPUs, some more who have 2~4 GPUs in a single PC that they also use, and I bet most of the miners outside of East Asia are actually people with a single GPU who wanted to offset the cost of their purchase.

 

Unless you actually come up with data backing up your claims about the amount of GPUs that went to miners, and not gamers or professionals, I guess that there's nothing else left for us to discuss.

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


7 minutes ago, LAwLz said:

as a gamer, am going to call into question the moral implications of mining's impact on the environment

Our civilization invests 2.5% of CO2 emissions into shipping, which I believe is justified by the utility it provides (efficiently moving cargo around).
Our civilization invests in fireworks and Christmas decorations, which I believe is justified by the utility they provide (entertainment and tradition).
Data centers cause 2% of global emissions, which I believe is justified by the utility they provide (digital data, digital services, telecom).

Personally, I see blockchain as a negative-sum game (people invested in blockchain universally care about the token/USD exchange rate, and blockchain can only shift USD around, minus electricity costs), an unregulated lottery. I believe 0.5% of CO2 emissions is not worth the utility of a lottery. You can run a lottery with a vastly smaller emissions footprint.


I haven't found a measure of GPU emissions for gaming; I would speculate they are comparable with crypto mining. I believe they are justified by the utility they provide (entertainment).

My position on using a GPU for mining blockchain vs. using a GPU for gaming is clear: I believe gaming GPUs are a justified use of resources, and I believe mining GPUs are not. As a gamer, and someone with relatives and colleagues who lost money to blockchain frauds, my opinion is clearly biased.


1 hour ago, 05032-Mendicant-Bias said:

My position on using a GPU for mining blockchain vs. using a GPU for gaming is clear: I believe gaming GPUs are a justified use of resources, and I believe mining GPUs are not. As a gamer, and someone with relatives and colleagues who lost money to blockchain frauds, my opinion is clearly biased.

This operates under the assumption that miners also do not find entertainment in mining. I know a few miners that simply enjoy mining and trading as a hobby. It's fun for them, they enjoy it, so they devote their free time and excess resources to it. By your logic, since they are using it for entertainment, it is "justified", is it not?

 

Who are we to dictate what is or isn't "entertainment" to others? Now do you see a problem with this one-sided, subjective justification?

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


31 minutes ago, MageTank said:

 

Who are we to dictate what is or isn't "entertainment" to others? Now do you see a problem with this one-sided, subjective justification?

 

It's an objective observation that mining has an overall negative impact on the environment, resulting in coal power stations being spun back up (or kept operating) just for mining.

 

That is not the case with gaming. Overwatch 2 doesn't come out and suddenly electricity prices spike as everyone and their dog is on it.

 

In places that have problematic, unregulated power, like Texas, mining can directly result in people dying when power cuts aren't enough during peak air-conditioning demand.

https://www.theverge.com/2022/7/12/23205066/texas-heat-curtails-bitcoin-mining-energy-demand-electricity-grid

 

 


7 hours ago, Brooksie359 said:

We are currently seeing a huge oversupply of GPUs as a direct result of cryptocurrency crashing.

A sudden supply flood isn't really the same thing as supply over time, though. If Nvidia withheld all supply of RTX 30 series cards for 12 months and then released the entire production backlog all at once, it would seem like a huge oversupply when in reality there is zero difference in total supply, only in the period of availability.

 

Be wary of short-term events.


2 hours ago, igormp said:

I still dare to say it's mostly regular people looking to upgrade for the next gen; the same thing ALWAYS happens when a new gen release is close.

 

Supply was coming back to normal even before the actual crash. As soon as rumors of the new models started to come out, prices started to drop and people started to sell their current GPUs, unlike the 2018 crash, when mining crypto actually became unprofitable (it's still profitable now, fwiw).

 

Do you even know what you're talking about? That fine was for what happened in 2018, and mostly related to the sales in China.

 

Even in the graph I posted before, you can clearly see that the number of GPUs sold wasn't even close to 2017~2018 levels, even though we had a freaking pandemic with people stuck at home, some with a stimulus check, who wanted to upgrade their play/workstations.

 

Again, most of the scenario you're making up in your mind is what used to happen in China, with big mining farms and tons of GPUs. In other places what you usually see is a handful of people who fill a single room with 10~20 GPUs, some more who have 2~4 GPUs in a single PC that they also use, and I bet most of the miners outside of East Asia are actually people with a single GPU who wanted to offset the cost of their purchase.

 

Unless you actually come up with data backing up your claims about the amount of GPUs that went to miners, and not gamers or professionals, I guess that there's nothing else left for us to discuss.

You could make a fair guess at how many went to mining based on used GPU sales, specifically listings with multiple GPUs for sale.


1 hour ago, Kisai said:

 

It's an objective observation that mining has an overall negative impact on the environment, resulting in coal power stations being spun back up (or kept operating) just for mining.

 

That is not the case with gaming. Overwatch 2 doesn't come out and suddenly electricity prices spike as everyone and their dog is on it.

 

In places that have problematic, unregulated power, like Texas, mining can directly result in people dying when power cuts aren't enough during peak air-conditioning demand.

https://www.theverge.com/2022/7/12/23205066/texas-heat-curtails-bitcoin-mining-energy-demand-electricity-grid

 

 

Thank you for sharing this. Now what exactly does it have to do with what I have been saying?

 

This is part of the problem I have with this rhetoric. Even when we remove the moral discussion from the equation and make it about one's personal views of what is or isn't entertainment, someone comes in, ignoring the context of the conversation, to make it a moral dilemma yet again.

 

Still, since I am still me, I'll humor your post again.

 

You are correct, it is an objective observation that mining has a negative impact on the environment. Nowhere in ANY of my posts have I attempted to refute this. On this specific point, you and I are in complete agreement. Now let's go revisit Past MageTank again, because that guy is an absolute wealth of knowledge when I need information:

On 7/14/2022 at 11:25 AM, MageTank said:

The logic isn't flawed at all. You cannot make it a moral issue, then claim moral superiority because your hobby is less harmful. The issue is that similar analogies would be viewed as a strawman, which I refuse to do. Much like comparing this to an apples-and-oranges or false-equivalence fallacy would technically also fall into that category. The irony of the situation is that this very topic is a fallacy in and of itself, specifically a suppressed correlative. If my argument is that both gaming and mining are bad for the environment, you can't simply counter this claim by saying "gaming is less harmful, so this isn't relevant". You are not refuting my claim that both are harmful; you are attempting to justify your opinion in the matter as a means of supplementing what you believe is the moral or logical outcome.

 

That said, let's humor your logic here. If I, as a gamer, am going to call into question the moral implications of mining's impact on the environment, is my lesser impact automatically free from the same scrutiny because the scale is smaller? Am I not to be seen as hypocritical by supplementing my disdain towards miners with a moral stance, when I myself am taking no additional measures to appease the moral concern I am referencing?

 

I am not arguing that one side is more environmentally impactful than the other.  I am simply arguing that people cannot claim moral superiority by using this as an excuse as to why gamers "deserve" the cards more, when both sides are objectively bad for the environment. If you throw more apples and oranges at me, I'll start throwing watermelons back.

Thank you, Past MageTank. Ever insightful as always. 

 

So you can see from that post that I was never arguing scale. @LAwLz brought up a very interesting point, but it's not a point I intend to argue, because I do not have the facts to corroborate the claims, nor do I care enough to invest that amount of time, as it does nothing for my point. My entire point has always revolved around people being hypocrites in their arguments against mining when they themselves are doing nothing to curb the emissions from their own hobbies. It's people pointing fingers at those they don't like while simultaneously proclaiming "I am not as bad, so you can't be mad at me!" If gamers want me to be sympathetic towards their cause, they should stop grasping for the moral high ground while refusing to adjust their lifestyles for the very cause they are attempting to champion. Just come out and say that you dislike mining for whatever personal reason you have and that you would prefer those cards be used for your preferred application. Is it a subjective opinion? Absolutely, but at the very least it doesn't come off as pretentious.

 

Let's also avoid using life and death against mining as an excuse for it being "objectively bad". Plenty of gamers have died from 24-hour streams.

https://www.cnn.com/2015/01/19/world/taiwan-gamer-death/index.html

https://www.washingtonpost.com/news/morning-mix/wp/2017/02/23/va-man-died-during-marathon-attempt-to-play-video-game-for-24-hours/

https://venturebeat.com/2015/03/05/man-dies-after-19-hour-world-of-warcraft-session/

 

If we are out here trying to save lives, might as well ban all of the fun stuff while we are at it.

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


5 hours ago, LAwLz said:

The entire argument of "gaming causes less damage so we shouldn't hate it" also hinges on the idea that gaming uses less power, which we don't know for sure. 

Nope, we know for sure.

5 hours ago, LAwLz said:

Sure, 1 GPU used for gaming for let's say 5 hours a day might not use as much power as 1 GPU used for mining 24 hours a day, but what if there are 5 times more gamers than miners?

[Image: chart of average hours spent gaming per week]

5 hours ago, LAwLz said:

All of a sudden it's 5 GPUs running for 5 hours a day each for a total of 25 hours a day, VS one mining GPU running for 24 hours.

Classic fallacy of relative privation. "As long as all gamers combined use more power than mining, mining isn't bad".

Considering most miners don't run a single GPU but multiple, the per-capita energy usage is even higher than your napkin maths implies.

5 hours ago, LAwLz said:

What is most likely to use the most power, 1 GPU used for mining or 9 GPUs used for gaming? Assuming those numbers are accurate of course. 

It's the wrong question and falls under the fallacy of relative privation. You should rather ask: "Who is using more power, a miner or a gamer?"

 

 


20 hours ago, igormp said:

The 960 and the 1060 3GB were both $199 parts, and even something like the 1050 Ti provided really great performance in most games for most people.

 

Nowadays an x60 part costs more than an x70 used to cost (and inflation doesn't justify it; the pricing for the 2000 and 3000 series was way above inflation during that period).

Nothing justifies it, but inflation is way over what you are stating. Real numbers are more like 30-50% in the last few years. We have certain food staples (that we need in order to sell products) that are up 75+% ($3.99 vs $6.19 now)!

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


39 minutes ago, HenrySalayne said:

You should rather ask: "Who is using more power, a miner or a gamer?"

Why? 

 

If the argument is "mining is bad because it destroys the environment" then we need to compare mining as a whole vs gaming as a whole. 

 

The environment does not care if 1 GWh is used for gaming or mining, nor does it care how many individuals used up that 1 GWh.

 

I don't understand why you think it is more okay to use up 1 GWh of power if it's multiple people doing it. It still uses up the same amount of natural resources to generate that energy.

 

The ozone layer doesn't heal just because you tell it "don't worry, the hole was caused by burning coal to power 10 gaming rigs, not one mining rig".


So MLID is now rumoring the complete opposite of what he said just a few days ago:

Nvidia GeForce RTX 4090 reportedly no longer delayed

https://www.pcgamesn.com/nvidia/RTX-4090-GPU-october-arrival

 

Which - of course - is headline speak for "the 4000 series was never delayed in the first place, the first rumor was complete bollocks". In particular his dates now line up with ones rumored by wccftech over a month ago - 4090 in October, 4080 in November and 4070 in December.

 

A lot of his video is spent talking about a "delay to the 600W full-fat AD102 GPU" - a 4090Ti if you will - saying that this is so they can keep it in reserve as a counter to AMD. But of course, anyone who's followed Nvidia's GPU launches will know that this is complete bollocks as well, and is instead a pattern we've seen for years. The full-fat XX102 gaming card is always released later, because initially yields on that chip are very low and the best chips are first reserved for quadro products (which use full-fat 102 chips, despite being released at about the same time as the initial release) - we can see this by looking at the A6000 and the 3090. This isn't "news" - this is just how Nvidia's product cycle works.

 

*Adds evidence to pile of reasons why anything MLID says should be taken with a mountain of salt*.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


42 minutes ago, ewitte said:

Nothing justifies it, but inflation is way over what you are stating. Real numbers are more like 30-50% in the last few years. We have certain food staples (that we need in order to sell products) that are up 75+% ($3.99 vs $6.19 now)!

Even considering 30~50% inflation, the adjusted value would still be below the launch MSRPs of the 2000 and 3000 series, especially the 2000 series, which came before all that mess.
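As a rough sanity check (taking the $199 figure quoted earlier and treating 30-50% as the cumulative inflation, both of which are assumptions for illustration):

```python
# $199 was the launch price quoted earlier for the 960 / 1060 3GB class of card.
base_msrp = 199

for inflation in (0.30, 0.50):
    adjusted = base_msrp * (1 + inflation)
    print(f"{inflation:.0%} inflation -> ${adjusted:.2f}")
# About $260 and $300 respectively: even the high end of that range sits
# below the 3060's $329 launch MSRP, let alone the street prices at the time.
```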

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


14 minutes ago, tim0901 said:

So MLID is now rumoring the complete opposite of what he said just a few days ago

He'll just release tons of different videos, each one with a different 4000 launch month, so he'll be right at some point

 

/s

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


1 hour ago, tim0901 said:

So MLID is now rumoring the complete opposite of what he said just a few days ago:

Nvidia GeForce RTX 4090 reportedly no longer delayed

https://www.pcgamesn.com/nvidia/RTX-4090-GPU-october-arrival

 

Which - of course - is headline speak for "the 4000 series was never delayed in the first place, the first rumor was complete bollocks". In particular his dates now line up with ones rumored by wccftech over a month ago - 4090 in October, 4080 in November and 4070 in December.

 

A lot of his video is spent talking about a "delay to the 600W full-fat AD102 GPU" - a 4090Ti if you will - saying that this is so they can keep it in reserve as a counter to AMD. But of course, anyone who's followed Nvidia's GPU launches will know that this is complete bollocks as well, and is instead a pattern we've seen for years. The full-fat XX102 gaming card is always released later, because initially yields on that chip are very low and the best chips are first reserved for quadro products (which use full-fat 102 chips, despite being released at about the same time as the initial release) - we can see this by looking at the A6000 and the 3090. This isn't "news" - this is just how Nvidia's product cycle works.

 

*Adds evidence to pile of reasons why anything MLID says should be taken with a mountain of salt*.

In other words 7900XT/4090 in Oct and 7950XT/4090Ti in July/July for the double-dippers 🙂


56 minutes ago, LAwLz said:

If the argument is "mining is bad because it destroys the environment" then we need to compare mining as a whole vs gaming as a whole. 

You still cling to the fallacy of relative privation.

Mining is bad for the environment. PERIOD.

That gaming also has an environmental impact is an entire discussion on its own.

59 minutes ago, LAwLz said:

I don't understand why you think it is more okay to use up 1GWh of power if it's multiple people doing it.

By definition a resource is limited, so it should be equitably allocated. This is a basic ethical principle.
 

1 hour ago, LAwLz said:

The ozone layer doesn't heal just because you tell it "don't worry, the hole was caused by burning coal to power 10 gaming rigs, not one mining rig".

I just gave you a graph showing the average time people spend gaming per week. One mining GPU equals more than 20 gaming rigs.
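That ratio is easy to reconstruct; the ~8 hours/week average below is an assumption standing in for the chart referenced above:

```python
# Assumption: the average gamer plays roughly 8 hours per week
# (standing in for the chart above); a mining GPU runs around the clock.
gaming_hours_per_week = 8
mining_hours_per_week = 24 * 7  # 168

print(mining_hours_per_week / gaming_hours_per_week)  # 21.0
# One always-on mining GPU draws as many GPU-hours as roughly 20 average gaming rigs.
```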

Nevertheless, your only argument for mining is "other things are also bad or even worse" which is still the fallacy of relative privation and not a justification for anything.


This topic is now closed to further replies.

