Kisai

Member
  • Content Count: 3,748
  • Joined
  • Last visited

Reputation Activity

  1. Agree
    Kisai got a reaction from AidyyJ in Speakers Only playing out one side   
    That might be a TRRS connection then. TRS is stereo, TRRS is stereo+mic, and while they're the same length, the contact placement differs to avoid shorting the mic.
  2. Agree
    Kisai got a reaction from Jet_ski in China’s Inner Mongolia Declares War on Crypto Mining   
    ASICs end up being waste from three angles: the waste of the materials, the waste of the energy to mine the materials, and the waste of the ASICs themselves when they're no longer profitable. Honestly, it's one of those cases where the people who actually made money are the ones who sold the miners.
     
     
    If you've ever seen how businesses dispose of computer hardware, it's always stuff that has little or no value, and it's treated like garbage. Some third party comes in, throws things into a dumpster, and hauls it away. No care is taken. That's why I'm saying the GPUs in ex-mining equipment are very likely run into the ground and then thrown away when something can replace them.
     
    The roadblock is really how energy is, or isn't, regulated.
     
    In a deregulated market (see Texas, Enron) the generation sources and transmission sources can pretty much toast miners if they wanted to. However, dirty-energy producers don't care about that; if they have to shut down, they lose money, so they will keep burning coal and natural gas regardless of the input costs until there are no customers buying. 
     
    If energy is regulated, then the energy producers cannot make an unapproved change in costs to consumers. Green energy usually benefits from a regulated market because its energy is more expensive to produce, while dirty energy ends up subsidizing green energy since both are forced to sell at the same rate. This also means that IPPs (independent power producers) can rip off the regulated market, driving up costs for everyone.
     
    So back to mining: if miners are buying energy from green IPPs, or from their own green generation, then there is no net environmental cost; they're just producing cryptocoins without hurting everyone in the process, other than maybe causing shortages in mining equipment and generation/transmission hardware. I doubt there are any cryptominers doing this because it's just cheaper to buy from a regulated state utility than deal with any of this.
     
     
  3. Agree
    Kisai got a reaction from Jurunce in China’s Inner Mongolia Declares War on Crypto Mining   
    A better context is "well if everyone stopped recycling, we would run out of newsprint, aluminium, and steel within a decade" 
     
    Steel mostly doesn't get recycled. Those shipping containers coated in toxic chemicals from China? They rarely get used to ship anything back, and turning them into tiny houses is a bad idea since the amount of energy to clean them is as much as making new ones.
     
    The point is, there is nothing recoverable from cryptomining.
    - Used ASIC miners? Garbage.
    - Energy used to mine bitcoin and others? Gone. Nothing recoverable.
    - Used GPUs? Likely going to the landfill when better stuff comes out. A small amount might get resold if they're less than 2 years old. But who's going to dismantle a cryptofarm and stop making money for a day just to do that?
     
    Quite frankly, anyone championing the current generation of cryptocurrencies is either an idiot or desperately trying to justify their foolish investment. This is not going to last. All that needs to happen to terminate all cryptocoin mining in any particular state or country is for the utility to start charging anyone using more than 15 kWh/day 10x the price from 6am to 6pm unless they have solar cells they can switch to. If that drives people to install solar... good. Otherwise all it's doing is driving up the cost of energy for everyone.
     
    Until I can buy a soda from a vending machine with a cryptocoin and not be charged more than the cost of the soda, it shall remain a fad.
  4. Like
    Kisai got a reaction from cunninghaaam in Pc I built over 2 weeks ago turned off, will not reboot   
    Always swap one part at a time.
     
    If it's in standby mode and won't turn on, there's a possibility that the NVRAM settings (eg for RAM) are wrong and it can't boot. But I've mostly seen "won't turn on" in the context of bad BIOS settings/wrong timings for the CPU/RAM.
  5. Like
    Kisai got a reaction from cunninghaaam in Pc I built over 2 weeks ago turned off, will not reboot   
    It's possible if the RAM is the wrong speed for the CPU.
  6. Like
    Kisai got a reaction from cunninghaaam in Pc I built over 2 weeks ago turned off, will not reboot   
    You never know. On my old DDR2 system, out of two different sets of memory sticks at the same speed, one caused the motherboard to constantly think the firmware needed to be reflashed. I'm not suggesting that's what happened here, but I am suggesting that if the system automatically used the XMP profile, it might have caused this.
  7. Agree
    Kisai got a reaction from Jet_ski in China’s Inner Mongolia Declares War on Crypto Mining   
    A better context is "well if everyone stopped recycling, we would run out of newsprint, aluminium, and steel within a decade" 
     
    Steel mostly doesn't get recycled. Those shipping containers coated in toxic chemicals from China? They rarely get used to ship anything back, and turning them into tiny houses is a bad idea since the amount of energy to clean them is as much as making new ones.
     
    The point is, there is nothing recoverable from cryptomining.
    - Used ASIC miners? Garbage.
    - Energy used to mine bitcoin and others? Gone. Nothing recoverable.
    - Used GPUs? Likely going to the landfill when better stuff comes out. A small amount might get resold if they're less than 2 years old. But who's going to dismantle a cryptofarm and stop making money for a day just to do that?
     
    Quite frankly, anyone championing the current generation of cryptocurrencies is either an idiot or desperately trying to justify their foolish investment. This is not going to last. All that needs to happen to terminate all cryptocoin mining in any particular state or country is for the utility to start charging anyone using more than 15 kWh/day 10x the price from 6am to 6pm unless they have solar cells they can switch to. If that drives people to install solar... good. Otherwise all it's doing is driving up the cost of energy for everyone.
     
    Until I can buy a soda from a vending machine with a cryptocoin and not be charged more than the cost of the soda, it shall remain a fad.
  8. Agree
    Kisai got a reaction from Cavalry Canuck in Need advice from Canadians on builds and moves.   
    Take the GPU and hard drive in your carry-on. Sell the rest.
     
    While CPUs are still hard to get, usually it's only the high-end parts that are rarely in stock. The low/mid-range parts are still available (eg Ryzen 5 3xxx and Ryzen 7 3xxx).
     
    The only reason I haven't built a new rig is that the 5xxx parts are not available and the Intel parts are not a good value, but you can get by with whatever is available.
  9. Agree
    Kisai got a reaction from tikker in Bitfinex sets record with titanic $4.5B and $5B BTC transactions   
    The thing is, by cracking down on wasteful things, that (in theory) should direct newer generations of e-currency ideas and policies toward designs that: 
    a) perform fast transactions
    with
    b) less energy waste
    and
    c) less environmental impact (from both the energy cost of producing the hardware and operating the network)
    and
    d) cost next-to-nothing to use so it can actually be used for daily transactions and impulse transactions, not just big-ticket items. 
     
  10. Agree
    Kisai got a reaction from TOMPPIX in Bitfinex sets record with titanic $4.5B and $5B BTC transactions   
    Bitcoin is quite literally doing what everyone knew would happen. All the "value" ends up in the pockets of a few, who already had deep enough pockets to invest in the mining in the first place. So they will turn around and dump their speculative holdings when they see it won't rise any further, leaving whoever is stupid enough to be left holding the bag with bitcoins that have no value. So the "rich" people will have gotten out a month before the value crashes, and then everyone else will want out but find nobody willing to buy, so the price will rapidly crash. Because bitcoin has no value in the first place, whoever is holding it last is likely to just abandon the wallets. Then that's it, Bitcoin and all competing cryptocurrencies are dead.
     
    The "fees" for trading are far too high for anyone to be willing to use it for any purchase, and too slow to use for impulse purchases. So let's reverse the numbers here. If the crypto exchange fee is $20.00, and energy is $0.10 , $200 in energy is wasted per transaction regardless of the transaction's value (Assuming that was the break even point.)
    So the amount of money you have to exchange to make it "better" than:
    Western union (4% of principal, or around $50 per 1000, depends on country)  = $400
    Walmart (10% of principal) = $200
    Paypal ($4.30 +2.9% of principal) = $541
     
    Like never mind shipping checks or other monetary instruments through the mail or courier.
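    A quick back-of-the-envelope check of those break-even figures (a minimal sketch; the fee numbers are the ballpark ones quoted above, not official rate cards, and the Western Union line uses the $50-per-$1,000 rate, i.e. 5%):

```python
# Break-even transfer amount: the principal at which each service's fee
# equals a flat $20 crypto transaction fee. Figures are the rough numbers
# quoted above, not official rate cards.

CRYPTO_FEE = 20.00  # assumed flat fee per crypto transaction

def break_even(percent, flat=0.0):
    """Principal where percent-of-principal plus a flat fee equals CRYPTO_FEE."""
    return (CRYPTO_FEE - flat) / percent

print(f"Western Union (~$50 per $1,000, i.e. 5%): ${break_even(0.05):,.0f}")        # $400
print(f"Walmart (10% of principal):               ${break_even(0.10):,.0f}")        # $200
print(f"PayPal ($4.30 + 2.9% of principal):       ${break_even(0.029, 4.30):,.0f}") # $541
```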
     
    If Bitcoin or some other cryptocurrency ever establishes itself, it will only be for very high-value exchanges, think mortgage payment or rent. And only as an interstitial. Nobody in their right mind would hold onto cryptocoins as a store of value with the kind of volatility they have. 
     
  11. Agree
    Kisai got a reaction from Stormseeker9 in Bitfinex sets record with titanic $4.5B and $5B BTC transactions   
    Bitcoin is quite literally doing what everyone knew would happen. All the "value" ends up in the pockets of a few, who already had deep enough pockets to invest in the mining in the first place. So they will turn around and dump their speculative holdings when they see it won't rise any further, leaving whoever is stupid enough to be left holding the bag with bitcoins that have no value. So the "rich" people will have gotten out a month before the value crashes, and then everyone else will want out but find nobody willing to buy, so the price will rapidly crash. Because bitcoin has no value in the first place, whoever is holding it last is likely to just abandon the wallets. Then that's it, Bitcoin and all competing cryptocurrencies are dead.
     
    The "fees" for trading are far too high for anyone to be willing to use it for any purchase, and too slow to use for impulse purchases. So let's reverse the numbers here. If the crypto exchange fee is $20.00, and energy is $0.10 , $200 in energy is wasted per transaction regardless of the transaction's value (Assuming that was the break even point.)
    So the amount of money you have to exchange to make it "better" than:
    Western union (4% of principal, or around $50 per 1000, depends on country)  = $400
    Walmart (10% of principal) = $200
    Paypal ($4.30 +2.9% of principal) = $541
     
    Like never mind shipping checks or other monetary instruments through the mail or courier.
     
    If Bitcoin or some other cryptocurrency ever establishes itself, it will only be for very high-value exchanges, think mortgage payment or rent. And only as an interstitial. Nobody in their right mind would hold onto cryptocoins as a store of value with the kind of volatility they have. 
     
  12. Agree
    Kisai got a reaction from GDRRiley in The hack of popular streamer Dellor - how did it actually happen and how can people protect themselves from this particular type of attack?   
    Sounds possibly more like spear-phishing. If someone is hated for some reason, you can bet people will try to bait them into doing something.
     
    Like enterprise banking involves installing security tools. Consumer, nah.
     
    Though I'm highly skeptical when people say they were hacked. Usually one of two things is true:
    a) They did something stupid and are embarrassed about it/caught in a lie
    b) They want to hide something illegal/immoral/unethical
     
    Like when MMO players claim they were hacked, they almost certainly downloaded an unauthorized tool, and that tool stole their login token in the background. There is a reason why cheat tools always trip AV products, and have done so since games were on floppy disks.
     
    The only difference between Windows Vista and later versions and earlier OSes in this regard is that people often get told to "run it as admin" as a troubleshooting step, and that's just plain bad advice. Like streamers keep getting told to run OBS or Streamlabs as admin, which just leads to a chain of "always run everything as admin to stream with it", so it would not surprise me if this was the case, but my money is on "wants to hide their losses on Robinhood".
     
  13. Agree
    Kisai got a reaction from Mellowie in Report: Stadia Blew Millions On Red Dead Redemption 2 And Other Ports   
    Because that's Google's entire modus operandi. Build bad products that are barely into beta testing, entice people to use them to work the bugs out, and then cancel it when it's not successful enough.
     
    Like look at the Killed by Google list, and you'll see a lot of stuff that people liked, but because it wasn't liked by a billion people, it was canceled. The average product there is killed after 3 years.
     
     
    Again, that's Google's MO. They get so excited to release something that it's always half-baked, and while nerds are fine with testing things, the general public hates change, and that's a surefire way to get people to stop using a product. You know what Google product has been relatively unchanged since inception? Google Search. Google learned one lesson from Yahoo/AltaVista/Excite early on: they are not a content portal, and should not clutter the front page.
     
    Now, if only they learned that lesson for YouTube and Stadia.
     
     
    Google's fatal mistake here was assuming anyone wanted Stadia in the first place. The controllers? Pretty much garbage; an Xbox 360 controller from 2005 was better. They bloody glued the things together so you can't even fix them, talk about cheap. They should have figured out how to use existing controllers first. The Chromecast? An off-the-shelf product they were already producing, and it overheats. Yet another cheap product. Google should have worked out an "app" version that would run on Android devices first, but most Android devices are also rubbish-tier and can't run Stadia either. Same with Google TV products, most of which are barely able to push 1080p30 video, because as I keep saying in this thread, they are the minimum of minimums.
     
    Google could, and really should, stop treating customers like idiots and actually raise the bar on what is the "minimum" required for these products, because it's quite frankly embarrassing to the Google brand when Google releases things and then turns around and cancels them 3 years later because of lack of adoption. Well, maybe if you'd tell the manufacturers who are making your products to produce something that isn't garbage, people would use it?
     
    What could have helped Stadia:
    - Work on anything that can receive a Chromecast in the first place. This means that Chromecast receivers need to support AV1 so 4K gaming is even possible. All existing Chromecast devices do not, including many smart TVs and cable STBs. This horse has long left the building. Existing Chromecasts only support 1080p, and even the new Ultra doesn't support Stadia.
    - Dedicated apps on Mac/PC/iOS/Android, not web browsers. I can't explain this clearly, but if you have more than one google account, Stadia in the web browser will keep using the wrong one, and that's just another reason why it's a huge pain in the ass to use.
    - Integrated with YouTube from the beginning, including handling the licensing of the game and any music used for streaming. This IMO is a major blunder. Maybe I want to stream Stadia; the built-in streaming exists, but so far I've been unable to use it, either stonewalled by the wrong account or because the attached YouTube account has a problem... because wrong account. And if I ever get past that point, they aren't going to help you ( https://support.google.com/stadia/answer/9825342?hl=en&ref_topic=9825535 ) if the game contains licensed music (eg CP2077) 
    - Crowd Play should have been there on day 1, and supported by every game. It's not. https://support.google.com/stadia/answer/9824891?p=crowd_play
     
    Ultimately, Google's entire attitude of "this is a cool product everyone will want to use, let's just release it and let people use it" tends to hurt it more often than it succeeds. The only products that ever worked like this are Google Search itself and Gmail, and Google has taken away the "unlimited storage" feature of Gmail in the last two years, which was promised to people who first signed up for it.
     
    This is why Google is not your friend. It's perpetually a bait-and-switch. "Here's a cool product, *promises of other features that will never appear in this film*"
     
  14. Like
    Kisai got a reaction from Luka95 in What GPU is doing when you watch a youtube video?   
    Depending on your browser.
     
    If you have a dGPU, the browsers will use the underlying hardware decoder IF it's available, and fall back to software. 
     
    There's a catch there though. You know those video ads every website runs now? Those use video decoding resources. In fact only the first video really makes use of the video decoder. Even if you have a second hardware decoder available. Once that video decoder hits the maximum it can decode, any additional requests to decode are kicked to software decoders.
     


     
    You likely won't even notice this unless you are watching the video engine data.
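    If you want to watch this yourself on an Nvidia card, here's a minimal sketch (assuming nvidia-smi is installed and on your PATH; it reports the dedicated decoder's load as utilization.decoder, roughly what Task Manager calls the "Video Decode" engine):

```python
# Poll nvidia-smi once a second and print 3D/graphics load vs. video-decoder load.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess
import time

QUERY = "utilization.gpu,utilization.decoder"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    # One line per GPU; print the first one.
    gpu, dec = (v.strip() for v in out.strip().splitlines()[0].split(","))
    print(f"3D engine: {gpu:>3}%   video decode: {dec:>3}%")
    time.sleep(1)
```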
     
    So right now, I'm watching a single 720p stream full screen at 4K (though the web browser is actually rendering at 1920x1080); it uses 10% of the video decode, BUT 75% of the 3D GPU for some reason.
     
    Now if I exit full screen, the 3D GPU goes down to 3% and the video decode goes down to 4%. What changed?
     

     
    The only difference between the beginning and the end of the charts here are Firefox being full screen with twitch. Nothing has been closed.
     
    Now if I repeat that with YouTube on Chrome: 1080p video, VP9. 3% video decode, 2% 3D while windowed, and if I switch to full screen:

    37% 3D, 14% video decode.
     
    So by all accounts, I should in theory be able to run 8 1080p videos on the video decode engine, however the upscaling being done by either Chrome or Firefox will only allow one or two videos (eg two monitors) before the Video subsystem is overwhelmed.
     
    So, what actually happens when you do overload it?
    Well.. I couldn't.
     

     
     
    6 windowed YouTube videos topped out at 40% on the video decoder and 8% on the 3D engine. However, the CPU was at 40%, and they were all dropping frames.

    After I closed all but one of the videos, the CPU dropped from 40% to 13%. Running just one video has negligible load on the CPU and GPU. It's as soon as additional videos run that the load increases (on the CPU primarily).
     
    So the takeaway from this is that if you have hardware decoding, you can run as many videos as needed on it (eg h264, VP9) until the decode engine is maxed out. Now, caveat: the reason I believe it maxed out at 40% is that these were all windowed, and because they were windowed, they likely switched automatically to 720p streams, because web browsers do not operate in 4K.
     
    To reframe this another way, the video decoders aren't "one video, period" they are only passing the bitstream to the decoder, so other things like audio and the actual file container are almost guaranteed to be in software.
  15. Informative
    Kisai got a reaction from Jayzer in What GPU is doing when you watch a youtube video?   
    Depending on your browser.
     
    If you have a dGPU, the browsers will use the underlying hardware decoder IF it's available, and fall back to software. 
     
    There's a catch there though. You know those video ads every website runs now? Those use video decoding resources. In fact only the first video really makes use of the video decoder. Even if you have a second hardware decoder available. Once that video decoder hits the maximum it can decode, any additional requests to decode are kicked to software decoders.
     


     
    You likely won't even notice this unless you are watching the video engine data.
     
    So right now, I'm watching a single 720p stream full screen at 4K (though the web browser is actually rendering at 1920x1080); it uses 10% of the video decode, BUT 75% of the 3D GPU for some reason.
     
    Now if I exit full screen, the 3D GPU goes down to 3% and the video decode goes down to 4%. What changed?
     

     
    The only difference between the beginning and the end of the charts here are Firefox being full screen with twitch. Nothing has been closed.
     
    Now if I repeat that with YouTube on Chrome: 1080p video, VP9. 3% video decode, 2% 3D while windowed, and if I switch to full screen:

    37% 3D, 14% video decode.
     
    So by all accounts, I should in theory be able to run 8 1080p videos on the video decode engine, however the upscaling being done by either Chrome or Firefox will only allow one or two videos (eg two monitors) before the Video subsystem is overwhelmed.
     
    So, what actually happens when you do overload it?
    Well.. I couldn't.
     

     
     
    6 windowed YouTube videos topped out at 40% on the video decoder and 8% on the 3D engine. However, the CPU was at 40%, and they were all dropping frames.

    After I closed all but one of the videos, the CPU dropped from 40% to 13%. Running just one video has negligible load on the CPU and GPU. It's as soon as additional videos run that the load increases (on the CPU primarily).
     
    So the takeaway from this is that if you have hardware decoding, you can run as many videos as needed on it (eg h264, VP9) until the decode engine is maxed out. Now, caveat: the reason I believe it maxed out at 40% is that these were all windowed, and because they were windowed, they likely switched automatically to 720p streams, because web browsers do not operate in 4K.
     
    To reframe this another way, the video decoders aren't "one video, period" they are only passing the bitstream to the decoder, so other things like audio and the actual file container are almost guaranteed to be in software.
  16. Agree
    Kisai got a reaction from FakeKGB in [RUMOR] AMD EPYC 7004 Genoa Zen 4 CPU Allegedly has 96 cores and 192 threads with 12-channel DDR5-5200 memory support   
    Considering that getting 24, let alone 96, GPUs or USB ports is going to be an issue.
     
    One of the key reasons high-density CPUs don't end up in desktops is that they typically run at half the clock speed. You know, at a speed that a game will notice, but server applications will not.
     
  17. Like
    Kisai got a reaction from metaleggman in Windows Keeps Losing Ethernet, Switches to WiFi   
    Ok, so your WiFi adapter completely disappears even when the wired connection is available. That's actually normal in laptops that are told to turn the WiFi off to save power when connected to a wired connection/docking station. 
     
    So that means something is actually causing the wired ethernet to drop carrier long enough for this switch to happen. The event log is actually pointing to the Hyper-V (virtualized adapter), which means that it's possible that the wired adapter is being reassigned to the virtual machine for some reason.
     
    If you are actually using Hyper-V (eg VirtualBox, Bluestacks, MEMU, etc) you might need to fiddle with the settings so that it emulates a network adapter rather than reassigns it. Though I reasonably suspect this is not the problem, as I've never had that problem with VirtualBox or MEMU. 
     
    As a way to determine if the cable is damaged or too long, force the wired adapter (at either end) to operate in 100Mbit mode and see if it does the same thing. It could be anything from aggressive power management (which actually saves power by lowering the link rate) to a kink/bend in the ethernet cable. 
  18. Like
    Kisai got a reaction from metaleggman in Windows Keeps Losing Ethernet, Switches to WiFi   
    I'd suggest getting a copy of the settings when it's connected to the different routers and comparing whether TOE (TCP offload engine) or jumbo packets are enabled on one and not the other.
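    A minimal sketch of one way to capture those settings on Windows for comparison (assumes PowerShell's Get-NetAdapterAdvancedProperty cmdlet is available; run it once per router and diff the two files):

```python
# Dump every adapter's advanced properties (TOE, jumbo packet, etc.) to a file,
# so snapshots taken on different routers can be diffed.
# Assumes Windows with the PowerShell NetAdapter module available.
import subprocess
import sys

def snapshot(outfile: str) -> None:
    ps = ("Get-NetAdapterAdvancedProperty | "
          "Select-Object Name, DisplayName, DisplayValue | "
          "Format-Table -AutoSize | Out-String -Width 200")
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", ps],
        capture_output=True, text=True, check=True,
    )
    with open(outfile, "w") as f:
        f.write(result.stdout)
    print(f"Saved adapter settings to {outfile}")

if __name__ == "__main__":
    # e.g. run once as "python snap.py router_a.txt", again as "router_b.txt",
    # then compare the two files with any diff tool.
    snapshot(sys.argv[1] if len(sys.argv) > 1 else "adapter_settings.txt")
```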
     
  19. Like
    Kisai got a reaction from Moonzy in Intel's secret weapon could make your next laptop upgrade much cheaper - Intel Compute Element 11 released   
    The MXM was primarily Dell.
    https://www.dell.com/support/kbdoc/en-ca/000153331/what-is-the-mxm-mobile-video-module-specification-kb-article-351318
     
    The thing is, Dell no longer uses it. The last laptops to do so (Precision M6800) were Haswell/Sandy Bridge systems. The Precision 7000 systems are all single-board motherboards, and the Precision 5000/XPS 15s are "thin and lights" where the motherboard is entirely exposed to the bottom of the laptop. 
     
    Like, devil's advocate... if you had a laptop that consisted of one MXM and one NUC CE, all you'd be left with are the chassis, ports, and M.2 slots for the SSD and WiFi. This would end up being three PCBs (left ports, right ports, and MXM-to-CPU CE bridge) and likely a bunch of flat cables to connect the boards; otherwise you'd have to pre-define the NUC and MXM sizes for each laptop size. The obvious problem here is that thin and lights are already super hot to handle, so every chassis would have to be designed to take the maximum TDP.
  20. Like
    Kisai got a reaction from dalekphalm in Intel's secret weapon could make your next laptop upgrade much cheaper - Intel Compute Element 11 released   
    If you think back to ye old Amiga/Atari systems, it does make sense.
     
    It gives you the ability to run two workflows without having to sacrifice the space for both. So I could see this being useful in a situation where the plugged-in compute element operates in a VM in another OS and you can run a Linux-based neural net application on it, but I'm incredibly skeptical of this being useful in any situation but headless-server-inside-another-computer. Especially since you can't operate it at desktop power settings.
     
    https://www.intel.com/content/www/us/en/products/docs/boards-kits/nuc/elements/nuc-elements-brief.html
     

     
    Yeah, that's definitely not a computer-in-a-computer intent. That's more of an "upgradable NUC, by throwing away the NUC and keeping the ports/SSD/cooler (eg the cheapest parts)".
     
    So in effect, it's more like a computer you would have bolted into your SUV/RV than one you'd seriously use as a desktop. So field-replaceability probably is the intent more than any other use.
  21. Like
    Kisai got a reaction from Hymenopus_Coronatus in Nvidia Mining GPU CMP HX: The state of mining from 2 perspectives and a meme no one asked for!   
    No single gamer has a reason to own more than 4 GPUs. None. If you're into ML (machine learning), maybe that changes, and having a rig built for training and not gaming might be of value; when it's not training, it's mining or powered off.
     
    My issue is when people go "oh you're leaving money on the table", no screw that. If I burned out the GPU in my desktop, I would have no GPU at all, and no means of replacing it. It's absolutely foolish to be mining on the single GPU in your desktop system, or any GPU in a laptop.
     
    So no, my ire is directed at people who see mining as free money, "capitalism ho!" nonsense, not those who mine in the background while they work on Excel documents. If electricity is more than $0.10 USD/kWh in your area, you are also paying a premium to mine that others do not.
  22. Like
    Kisai got a reaction from LogicalDrm in Is there still a demo scene?   
    http://www.pouet.net/ Stuff old and new
    https://hornet.org/ PC demos up to 1998
    https://demozoo.org/ More New stuff
     
    What you don't really see a lot of is "new stuff on new hardware" , a lot of new hardware is more locked down (such as consoles and phones)
     
    https://www.vice.com/en/article/j5wgp7/who-killed-the-american-demoscene-synchrony-demoparty
     
     
    So, the demo scene still exists, it's just largely not an American thing. Demos were a consequence of trying to make do with what was available, and what's available now is rather consistent (x64 Windows 10 PCs); only the scale is different.
     
    Like compare the absolute worst computers being sold today (Chromebooks and Android phones) to the highest-end desktops with i9s and 3090 Ampere GPUs, and there is the potential for some interesting stuff to be built, but the gap in performance is still only going to have you target the lowest-end device that people could reasonably have, and that demo may not work if the OS changes.
     
    Then there is the malware aspect. People are a lot less willing to download things from the internet that they have no reason to download. Demos were once synonymous with piracy, and the demos were usually ads for BBS systems that no longer exist.
     
  23. Like
    Kisai got a reaction from birdflyer in Should I buy a used YubiKey?   
    I wouldn't. For all you know any used one might be compromised.
     
  24. Informative
    Kisai got a reaction from linuxChips2600 in Gaming Performance Tested On 'Worn Out' RTX 2080 Ti Mining Card   
    None of the Nvidia GPUs earn more than $13/day apiece. You can deliver a single pizza and earn more in a day. Also, the price of Ethereum crashed like 25% yesterday. 
     
    A living wage is $20 x 8 hours a day = $160. 40 hours per week x $20 = $800 is the bare minimum to hit where you don't need a pizza job. There are 168 hours in a week. So assuming you use the computer for nothing but mining, you might make $70 per week. So how many GPUs do you need before you theoretically don't need to prioritize a real job? Likely 12 $2500 GPUs.
     
    So figure it out.
    3090 ($3800/ea * 13) = $12.97/ea, input cost is $49,400 to make $168.61/day, recovery time 293 days.
    3080 ($3500/ea * 13) = $12.84/ea, input cost is $45,500 to make $166.92/day, recovery time 273 days.
    3070 ($1400/ea * 25) = $6.49/ea, input cost is $35,000 to make $162.25/day, recovery time 215 days.
    3060Ti ($1500/ea * 25) = $6.58/ea, input cost is $37,500 to make $164.50/day, recovery time 228 days.
    3060 ($____/ea * 46) = $3.54/ea, input cost is unknown, to make $162.84/day, recovery ?.
     
    A 2080Ti is the same rate as a 3070. So the most efficient card in that list is the 3060Ti, which is at a rate equal or better than the 3070. This excludes any efficiency gains or losses from any other hardware.
     
    $160/day = $41600/yr. So assuming you didn't pay taxes, rent/mortgage or eat any food, you would have to sink likely 2 years salary at existing scalper prices, and would only recover that cost after 9 months. That's an unrealistic expectation.
     
    So no, I think people spending $2500+ on a GPU now to make $10/day are fools who are parting with their money.
     
    Let's re-roll the numbers with the original MSRP:
    3090 ($1500/ea * 13) = $12.97/ea, input cost is $19,500 to make $168.61/day, recovery time 116 days.
    3080 ($700/ea * 13) = $12.84/ea, input cost is $9,100 to make $166.92/day, recovery time 55 days.
    3070 ($500/ea * 25) = $6.49/ea, input cost is $12,500 to make $162.25/day, recovery time 77 days.
    3060Ti ($400/ea * 25) = $6.58/ea, input cost is $10,000 to make $164.50/day, recovery time 61 days.
    3060 ($____/ea * 46) = $3.54/ea, input cost is unknown, to make $162.84/day, recovery ?.
     
    So yeah, maybe at those prices a 3080 might look cost-effective to mine with when you're not using the computer for anything else. The rest of those? Definitely not worth it, particularly the 3090. That likely explains the scalper price differential on eBay. The 3080/70/60Ti aren't entirely unreasonable, but the amount of space wasted, along with the e-waste and waste heat generated by the cheaper cards, is ridiculous.
     
    At the original price, dumping $10,000 on GPUs doesn't look smart, but if you don't need a day job after that point, that's a rather attractive option. Dumping $45,500 though, no, that's insanity.
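    The recovery times above are just total purchase cost divided by total daily earnings; a minimal sketch that reproduces both tables (to within a day of rounding), using the per-card $/day figures and card counts quoted above:

```python
# Recovery time = total purchase cost / total daily mining revenue,
# using the per-card $/day figures and card counts quoted above.
cards = [
    # name,     scalper $, MSRP $, count, $/day per card
    ("3090",    3800,      1500,   13,    12.97),
    ("3080",    3500,       700,   13,    12.84),
    ("3070",    1400,       500,   25,     6.49),
    ("3060Ti",  1500,       400,   25,     6.58),
]

for name, scalper, msrp, count, per_day in cards:
    daily = per_day * count
    for label, price in (("scalper", scalper), ("MSRP", msrp)):
        cost = price * count
        print(f"{name} @ {label}: ${cost:,} up front, ${daily:.2f}/day, "
              f"~{cost / daily:.0f} days to recover")
```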
     
  25. Funny
    Kisai got a reaction from Mark Kaine in Taking PC as checked in luggage, got it all figured out aside from one thing, packing material that is esd safe for the inside.   
    Do realize that:
    a) Airlines routinely drop luggage from like 20ft in the air
    b) Heavy, imbalanced luggage will usually fall and damage anything inside it that isn't clothing.
     
    If you want to pack the inside, don't use plastic/polyester or other synthetic cloth/foam. You can wrap the outside with cloth and foam as long as the expansion ports are covered with cardboard. But really I would not ship a computer by air. At least not in the original chassis box and packing material.
     
    Honestly, the safest way to transport a desktop is by personal car. The second best way is by fedex/ups/dhl. Both of these options are going to be much more expensive. 
     
    If you don't have that option, find whatever cardboard you have (eg cereal box cardboard will do), fold it up into toilet-paper-tube-like shapes, and stick those between the expansion cards and voids so that if they do experience a shock, they aren't hitting something that will break the PCIe edge off. Most expansion cards don't have much on them in the first place; it's mainly GPUs that are heavy enough to break the PCIe card slot.
     
     