
Crypto miners are hoarding early shipments of RTX 3080s

2 minutes ago, CarlBar said:

 

Is that by volume or by profit though? Server parts tend to be very high margin but, as @leadeater pointed out to me recently, not high volume.

Actual revenue, see:

[chart: NVIDIA quarterly revenue by segment]

 

Q1 2020 is available out there; I'm just too lazy to search for it.

 

And AMD only recently managed to get traction on the server side with EPYC (from product launch to actual sales for DCs it can take well over a year due to homologation), so most of their revenue still comes from desktops (consoles are high volume but low margin; I'm not sure how much they represent for them).



3 minutes ago, igormp said:

Actual revenue, see:

[chart: NVIDIA quarterly revenue by segment]

 

Q1 2020 is available out there; I'm just too lazy to search for it.

 

And AMD only recently managed to get traction on the server side with EPYC (from product launch to actual sales for DCs it can take well over a year due to homologation), so most of their revenue still comes from desktops (consoles are high volume but low margin; I'm not sure how much they represent for them).

 

Achh, that doesn't help much. Revenue can (and from what I've seen usually does) just mean total income before costs are deducted, which isn't very useful when you've got a 2070 using a TU104 die going for $500-600 and a Quadro 5000 using a variant of the same die going for $5,000. The Quadro costs nearly 10 times as much and has well over 10 times the margin, meaning that even if gaming brings in a lot of raw cash, the actual profit from it will be much smaller. You need actual profit numbers, not just total cash flow.


2 hours ago, Blademaster91 said:

I don't know how accurate the Steam hardware survey is, but I'm pretty sure it's voluntary. I don't doubt most people are buying cards in the $200-300 price range, not on a $700 budget; you can build a whole PC for $700, or get a console and a TV for that much. And people with $700 sitting around presumably have another hobby, or a console they can play games on if they sold their GPU.

I wonder how many people are fed up with Nvidia increasing prices since the RTX 2000 series; the x80 card should be $550-600, not "starting at" $700 with AIB models $50-100 more.

Steam ran the voluntary survey a few days ago.

 

Accuracy probably depends on whether it's looking at all GPUs or only GPUs used as render devices, because Intel laptops always have the Intel GPU active with the dGPU unused unless something invokes it, whereas a desktop system may have the iGPU enabled for various reasons. The System Info in Steam sees my GeForce GTX 1080 first and omits the iGPU.
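
For what it's worth, the difference is easy to see programmatically. A minimal sketch (my own illustration, not how Valve actually does it; assumes a Windows machine with the legacy wmic CLI, which is deprecated on recent builds):

```python
# Minimal sketch: enumerate every video controller Windows reports,
# which is NOT the same as asking which GPU is actually rendering.
# Assumes the legacy 'wmic' CLI is present (deprecated on recent builds).
import subprocess

def list_all_gpus():
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=True,
    ).stdout
    # First line is the 'Name' header; the rest are controllers, iGPU included.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

print(list_all_gpus())
# e.g. ['Intel(R) HD Graphics 630', 'NVIDIA GeForce GTX 1080']
```

A survey that instead asks the 3D API for the active render device would only ever see the GTX 1080 on my machine.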


58 minutes ago, CarlBar said:

 

Achh, that doesn't help much. Revenue can (and from what I've seen usually does) just mean total income before costs are deducted, which isn't very useful when you've got a 2070 using a TU104 die going for $500-600 and a Quadro 5000 using a variant of the same die going for $5,000. The Quadro costs nearly 10 times as much and has well over 10 times the margin, meaning that even if gaming brings in a lot of raw cash, the actual profit from it will be much smaller. You need actual profit numbers, not just total cash flow.

Well, Nvidia doesn't break profit numbers out by segment. But anyway, even if the margins are lower (which they surely are), it still shows how much they sell and how dominant they are in that market.



1 hour ago, CarlBar said:

 

Achh, that doesn't help much. Revenue can (and from what I've seen usually does) just mean total income before costs are deducted, which isn't very useful when you've got a 2070 using a TU104 die going for $500-600 and a Quadro 5000 using a variant of the same die going for $5,000. The Quadro costs nearly 10 times as much and has well over 10 times the margin, meaning that even if gaming brings in a lot of raw cash, the actual profit from it will be much smaller. You need actual profit numbers, not just total cash flow.

Only if the same number of GPUs are sold: $100 profit on 100 GPUs and $1 profit on 10,000 GPUs is the same. This is why Ford makes 1,000-horsepower Mustangs: not to sell 300 1,000-horsepower Mustangs, but to sell 100,000 V6 Mustangs. And 300,000 Focuses. The money is often in volume.
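
Or in plain numbers (toy figures, just to make the point):

```python
# Toy figures: identical totals from opposite strategies.
margin_play = 100 * 100    # $100 profit on each of 100 GPUs   -> $10,000
volume_play = 1 * 10_000   # $1 profit on each of 10,000 GPUs  -> $10,000
assert margin_play == volume_play
```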



2 hours ago, Bombastinator said:

Only if the same number of GPUs are sold: $100 profit on 100 GPUs and $1 profit on 10,000 GPUs is the same. This is why Ford makes 1,000-horsepower Mustangs: not to sell 300 1,000-horsepower Mustangs, but to sell 100,000 V6 Mustangs. And 300,000 Focuses. The money is often in volume.

 

The problem is that in those graphs the gaming and non-gaming revenues aren't that far apart, and both of those values are cost-agnostic.

 

Let's take the highest figures on that graph, Q1 2019. I'd estimate total revenue at around 3,250 and gaming at around 1,750, which puts non-gaming at around 1,500 (all in millions of USD).

 

Let's say NVIDIA's average cost, as a percentage of sale price, for a gaming GPU is around 25% (probably an underestimate at the low end and an overestimate at the high end; I suspect it's low on average, but picking a figure we think is too low is fine, since I just need a non-outrageous example to make my point). Let's also say cost as a percentage of total GPU value is 4 times lower on the server side (almost certainly a huge underestimate). That puts the server-side cost at around 6.25% of sale price.

 

Since this is total income, deducting each segment's cost percentage gives a rough gross profit. A quick bit of maths puts gaming at around 1,312 and non-gaming at around 1,406. The more the costs go up on the gaming side and the more the margins go up on the non-gaming side, the bigger that difference gets (with a 30% cost on gaming and a 5-times margin advantage on non-gaming, the figures shift to 1,225 vs 1,410).
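
If anyone wants to sanity-check the arithmetic, here it is as a few lines of Python (revenue figures eyeballed from the graph, cost percentages my guesses from above, so purely illustrative):

```python
# Back-of-the-envelope gross profit per segment. Revenue is eyeballed
# from the graph (millions of USD) and the cost percentages are guesses,
# so treat the output as illustrative only.
revenue = {"gaming": 1750, "non_gaming": 1500}

def gross_profit(cost_pct):
    return {seg: rev * (1 - cost_pct[seg]) for seg, rev in revenue.items()}

# Scenario 1: gaming costs 25% of sale price, server side 4x lower (6.25%).
print(gross_profit({"gaming": 0.25, "non_gaming": 0.25 / 4}))
# -> {'gaming': 1312.5, 'non_gaming': 1406.25}

# Scenario 2: gaming costs 30%, server side 5x lower (6%).
print(gross_profit({"gaming": 0.30, "non_gaming": 0.30 / 5}))
# -> {'gaming': 1225.0, 'non_gaming': 1410.0}
```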

 

And that's before we come to the question of what they're including in "gaming". Are G-Sync modules and HDR certification included in that? I bet those make them a pretty penny. Is the Nintendo Switch in there too? There are probably a few other things I'm not thinking of that they might be including. (Of course, non-gaming could include their recently acquired networking business as well.)

 

 


Just now, rrubberr said:

Ha. Or sell 0. 

Low rollers like to look sharp too. Build only good ones, or build a couple of really good ones, market the heck out of them, and then sell a whole bunch of lowball ones because perception has gotten fuzzed. It's the whole concept behind halo products.



2 minutes ago, CarlBar said:

 

The problem is that in those graphs the gaming and non-gaming revenues aren't that far apart, and both of those values are cost-agnostic.

Let's take the highest figures on that graph, Q1 2019. I'd estimate total revenue at around 3,250 and gaming at around 1,750, which puts non-gaming at around 1,500 (all in millions of USD).

Let's say NVIDIA's average cost, as a percentage of sale price, for a gaming GPU is around 25% (probably an underestimate at the low end and an overestimate at the high end; I suspect it's low on average, but picking a figure we think is too low is fine, since I just need a non-outrageous example to make my point). Let's also say cost as a percentage of total GPU value is 4 times lower on the server side (almost certainly a huge underestimate). That puts the server-side cost at around 6.25% of sale price.

Since this is total income, deducting each segment's cost percentage gives a rough gross profit. A quick bit of maths puts gaming at around 1,312 and non-gaming at around 1,406. The more the costs go up on the gaming side and the more the margins go up on the non-gaming side, the bigger that difference gets (with a 30% cost on gaming and a 5-times margin advantage on non-gaming, the figures shift to 1,225 vs 1,410).

And that's before we come to the question of what they're including in "gaming". Are G-Sync modules and HDR certification included in that? I bet those make them a pretty penny. Is the Nintendo Switch in there too? There are probably a few other things I'm not thinking of that they might be including. (Of course, non-gaming could include their recently acquired networking business as well.)

 

 

There's a saying: "there are lies, damn lies, and statistics." They weren't my graphs; I don't even know who created them. I was replying to another post. If the graphs are misleading, that's how graphs often are: they are great for visualizing data and can reveal things, but they can also be created to misrepresent.

There used to be this great website called "Data Is Beautiful", but it went dead when Covid hit. I don't even know what the graphs are referring to.



12 hours ago, CarlBar said:

Achh, that doesn't help much. Revenue can (and from what I've seen usually does) just mean total income before costs are deducted, which isn't very useful when you've got a 2070 using a TU104 die going for $500-600 and a Quadro 5000 using a variant of the same die going for $5,000. The Quadro costs nearly 10 times as much and has well over 10 times the margin, meaning that even if gaming brings in a lot of raw cash, the actual profit from it will be much smaller. You need actual profit numbers, not just total cash flow.

Quadros and Teslas also come with software and developer support where GeForce cards don't, so there are extra costs in that product line as well. Nvidia also charges software license fees on top of the hardware features the cards support, so even once you have the card you may not actually be able to use it without paying yet more money; VDI/remote workstations, for example.

 

Nvidia has to run an entire division for pre-sales support, pre-sales engineering, hardware support, and software support, which is all pretty well exclusive to the Quadro/Tesla product line, so that is where some of that 10x comes from (not that they don't still make more from those cards).

 

GeForce is much more a case of making the product, throwing it onto the market, and then providing driver support, along with game developer partnerships etc. Basically it's cheaper to help out with a few games per year than to provide wider-scale support to enterprise and cloud provider customers.

 

But you can tell where Nvidia is focusing strategy-wise, and that gives some clue as to what is more profitable, or at minimum has better projected growth in revenue and profit: enterprise and cloud providers. I still believe gaming is the single largest part of Nvidia currently, but it's not like they really need to focus on it in any special way. You can see this in how Nvidia does the GPU dies now: Gx100 is exclusive to Tesla/Quadro and has a completely different sub-architecture all around (SM makeup, memory controller, etc.), which they design first before any other die, or modify for the gaming use case.


How does the Steam hardware survey track laptop hardware? No doubt Pascal was a very popular gen and Turing sucked, but I wonder how laptops affect the stats; 1050/1050 Ti/1060 laptops were a very good price point and available for a long time.



Plot twist: it's all a hoax, just a psyop by the Radeon Technologies Group.



6 minutes ago, VeganJoy said:

How does the Steam hardware survey track laptop hardware? No doubt Pascal was a very popular gen and Turing sucked, but I wonder how laptops affect the stats; 1050/1050 Ti/1060 laptops were a very good price point and available for a long time.

I guess it is just incorporated into it.

 

True, lots of popular GTX 1060 laptops out there. But there are a fair number of 20-series laptops as well. The fact that people didn't see the need to upgrade their laptops to Turing also says something.


22 minutes ago, maartendc said:

I guess it is just incorporated into it.

 

True, lots of popular GTX 1060 laptops out there. But there are a fair number of 20-series laptops as well. The fact that people didn't see the need to upgrade their laptops to Turing also says something.

In fairness, laptop users are probably less likely to upgrade without a massive performance bump. People buying laptops probably approach computers differently from desktop users: you've got to replace the whole laptop and transfer all your data over, which is a lot of hassle compared to just slotting in a new GPU. The 9xx-series laptops weren't spectacular, and I'm pretty sure people buying laptops weren't trying to run RTX games. Plus laptops didn't get Super variants, AFAIK. Turing was exceptionally poor value; 980 Ti/1080 Ti gang rise up :D



I'm hoping to get my hands on a Founders Edition card; it looks pretty good for a stock cooler from Nvidia. Although I am tempted to hold off and see what AMD has to offer; I'd be curious whether they can match the performance for less.



I didn't even know GPU mining ETH was still a thing; I thought ASICs had taken over by the end of 2017?



2 minutes ago, jasonc_01 said:

I didn't even know GPU mining ETH was still a thing; I thought ASICs had taken over by the end of 2017?

Might be specific to China. IIRC coin mining is actually illegal there; they have special issues with electricity, currency, and banking. I don't know how it works.



45 minutes ago, jasonc_01 said:

I didn't even know GPU mining ETH was still a thing; I thought ASICs had taken over by the end of 2017?

I actually kind of keep up with cryptocurrencies because I find it all interesting.

The answer to this is no. ASICs aren't much more efficient than GPUs at mining ETH, and because they are so specialized, miners typically avoid them.

What will bring this to an end is ETH 2.0, which will switch ETH to proof-of-stake (PoS) rather than proof-of-work (hardware mining). However, there isn't a hard date for when it will arrive; could be this year, could be next, could be after that.

ETH recently ran up in price and decentralized finance apps took off, causing a surge of transactions, so GPU miners have been having a field day. Six 1080s were doing something like $600/mo at the recent peak, though it's come back down a bit now.

With the 30-series cards we don't really know yet, but mathematically they should be at least 2x more efficient per watt on compute. So in theory they are super desirable because they would literally double a miner's profits (which won't happen in practice, because everyone will buy them and the difficulty will adjust, but people think this way).
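
A toy model of that difficulty effect (every number here is invented, just to show the shape of it):

```python
# Toy model, all numbers invented: profit scales with your share of the
# network hashrate, so an efficiency jump only pays until everyone else
# makes the same jump and the difficulty catches up.
def daily_profit(my_hashrate, network_hashrate, daily_rewards_usd, my_power_usd):
    share = my_hashrate / network_hashrate
    return share * daily_rewards_usd - my_power_usd

print(daily_profit(300e6, 200e12, 5_000_000, 4.0))  # today: $3.50/day
print(daily_profit(600e6, 200e12, 5_000_000, 4.0))  # new cards, same power: $11.00/day
print(daily_profit(600e6, 400e12, 5_000_000, 4.0))  # everyone upgrades: back to $3.50/day
```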

Crypto market movements also seem to be following the stock market recently (since the crash in March).

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


On 9/8/2020 at 9:34 PM, DuckDodgers said:


 

Ethereum Miners Eye NVIDIA’s RTX 30 Series GPU as RTX 3080 Offers 3-4x Better Performance in ETH.

 

 

Sorry Jensen, but as a Pascal owner this might delay the advertised upgrade by a long shot.

 

Sources

https://www.hardwaretimes.com/ethereum-miners-eye-nvidias-rtx-30-series-gpu-as-rtx-3080-offers-3-4x-better-performance-in-eth/

Now I expect the GTX 10-series cards to go up in price again on eBay.


Why don't the miners just build the special circuits (I can't remember what they are called right now)? Those can do a trillion hashes per second or something like that, much better than the billions of a GPU.


I think it can be concluded that there are reasons why mining is stupid.  There may be reasons why mining specifically in China is not stupid. I don’t know if they exist or what they might be though.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


1 hour ago, Justaphysicsnerd said:

Why don't the miners just build the special circuits (I can't remember what they are called right now)? Those can do a trillion hashes per second or something like that, much better than the billions of a GPU.

ASICs have to be built for the algorithm used, and if the algorithm mutates it can't be done on an ASIC, only an FPGA. A GPU is basically a very large parallel math processor, not an FPGA or ASIC; GPUs are only good for highly parallel tasks, whereas CPUs are very good at linear, serialized tasks.

 

All an ASIC does is run "one program" in fixed-function hardware, which means it can be done faster and with less energy than a software solution or an FPGA. But that also means that if the algorithm changes, the ASIC is completely useless.
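
A loose software analogy (nothing to do with real silicon, just to show why "frozen at build time" matters):

```python
# Loose analogy: an "ASIC" bakes the algorithm in at build time, while a
# programmable device can load whatever the network switches to.
import hashlib

def fabricate_asic(algo_name):
    algo = getattr(hashlib, algo_name)  # chosen once, at "fabrication"
    return lambda data: algo(data).hexdigest()

asic = fabricate_asic("sha256")
print(asic(b"block header"))  # fast path for the one algorithm it was built for

# If the network "mutates" the algorithm to SHA-3, the ASIC above still
# only speaks SHA-256; a programmable device just swaps the function in.
flexible = lambda data: hashlib.sha3_256(data).hexdigest()
print(flexible(b"block header"))
```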


2 hours ago, Bombastinator said:

I think it can be concluded that there are reasons why mining is stupid.  There may be reasons why mining specifically in China is not stupid. I don’t know if they exist or what they might be though.

Isn't most of the bitcoin mining now concentrated among 3-4 big miners?


3 hours ago, bcredeur97 said:

I actually kind of keep up with cryptocurrencies because I find it all interesting.

The answer to this is no. ASICs aren't much more efficient than GPUs at mining ETH, and because they are so specialized, miners typically avoid them.

What will bring this to an end is ETH 2.0, which will switch ETH to proof-of-stake (PoS) rather than proof-of-work (hardware mining). However, there isn't a hard date for when it will arrive; could be this year, could be next, could be after that.

ETH recently ran up in price and decentralized finance apps took off, causing a surge of transactions, so GPU miners have been having a field day. Six 1080s were doing something like $600/mo at the recent peak, though it's come back down a bit now.

With the 30-series cards we don't really know yet, but mathematically they should be at least 2x more efficient per watt on compute. So in theory they are super desirable because they would literally double a miner's profits (which won't happen in practice, because everyone will buy them and the difficulty will adjust, but people think this way).

Crypto market movements also seem to be following the stock market recently (since the crash in March).

The last time I was current with crypto, circa 2017, I remember ETH ASICs were taking over and were 10x better than GPU mining, and I remember that even then ETH was going PoS anyway and mining would be defunct on it.



7 minutes ago, jasonc_01 said:

The last time I was current with crypto, circa 2017, I remember ETH ASICs were taking over and were 10x better than GPU mining, and I remember that even then ETH was going PoS anyway and mining would be defunct on it.

Lately the hype has been around DeFi apps. I've got to say, they are actually kind of interesting and fun to use if you have time to check out some of the projects, like Curve Finance, yearn.finance, Uniswap, Aave, Synthetix, Set Protocol, etc.

Digging into it, I found these way more interesting; they add a lot of depth to crypto beyond just "which coin should I invest in?". Ethereum fees are kind of high, though, so expect to pay some fees to mess with stuff.

Cool concepts anyway. Still very early stages. *insert not financial advice disclaimer here*

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown

Spoiler

Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1

