Everything posted by skaughtz

  1. I also have an EVGA 2060 KO Ultra Gaming card that is shorter than my 2070 (EVGA Black Gaming) but has sensors for memory temp. That card runs warmer just because of its smaller size and cooler. I realize that it is not apples to apples, but if the 2060 hits a certain memory clock overclock with its memory running at a safe temperature (say, +1000MHz at 74C), would it be safe to assume that the 2070 is not running any higher than that at the same speeds? The 2070 runs cool but does not have sensors for the memory temperature. At +500MHz on the 2060, the memory is reading 72C with 85W total
  2. Thanks, man. Any suggestion on what memory temperatures are safe to run constantly?
  3. I have a 2070 I use for mining Ethereum (but this question should apply to any card). I have the voltage locked in at 700mV, with the core clock at stock, and the memory clock at +500MHz. The card runs at 47C and only pulls 92W. All good. Yesterday I tinkered with the memory clock and managed to get it up to +1350MHz stable, before I decided to look more into what I was doing. The card temperature remained at 47C, but the power draw increased to 99W. The hash rate increased 5Mh/s, but I was concerned that I was destroying the memory, either through heat or stress (there is no
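      For anyone wanting to sanity-check a memory OC like this, a quick hash-per-watt comparison is a decent first filter. A minimal Python sketch; the absolute hash rates below are made-up placeholders (the post only gives the wattages and the +5 MH/s delta):

      ```python
      # Rough efficiency check for the two memory-clock settings described above.
      # Hash rates are ASSUMED placeholders; only the +5 MH/s delta and the
      # 92W -> 99W power figures come from the post.
      def efficiency(mh_s: float, watts: float) -> float:
          """Hash rate delivered per watt of board power."""
          return mh_s / watts

      baseline = efficiency(36.0, 92.0)  # +500 MHz setting (assumed 36 MH/s)
      tuned = efficiency(41.0, 99.0)     # +1350 MHz setting (+5 MH/s, +7 W)

      print(f"baseline: {baseline:.3f} MH/s per W")
      print(f"tuned:    {tuned:.3f} MH/s per W")
      ```

      If the tuned number comes out higher, the overclock is at least paying for its extra power draw, though that says nothing about memory temperature.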
  4. That's exactly my concern. There is a notable hash rate increase with each step from +1000 on, but I have no way of telling what the temps are outside of the main GPU temperature reported, which is 47C, through Precision X1 (all of the cards are EVGA, and I use that program for everything but the voltage, which is locked at 700mV in the mining command line). I figured that by locking in the mV through trex it wouldn't feed anything on the card more than it could handle, since that is quite a bit undervolted to begin with. My original goal was to drop the total power draw. Now th
  5. I would really love some more insight into the whole memory degradation thing. I decided to just mess with my EVGA RTX 2070 Black Gaming to see what I could get out of it. As of me writing this, I stopped at +1350MHz on the memory clock and found that -150MHz was the sweet spot on the core clock. -200MHz core clock would drop the hash rate, while anything above -150MHz didn't make a difference. It is still locked in at 719mV, running at 47C, and I increased the hash rate more than 5MH/s. It still has not crashed, which worries me. Am I burning up something unknown? +1
  6. Thanks for the info. The bit about Pascal and ECC memory makes sense. My 1080 Ti gives me the best hash rate with a -400MHz memory clock. The 1080 and 1070s both showed improvement bumping it up to +500MHz. The Pascal cards also do not like the decrease in power. The 20-series cards didn't drop a blip locking in the power at 700mV. I guess I just need to go through one by one and increase until it crashes. Any advice on the core clocks? My understanding is that the Ethereum algorithm is memory clock reliant, so I have not even bothered touching the core clocks. Since the card
  7. I have my rig set up mining Ethereum and I plopped a +500MHz memory clock offset on all of my mining cards. 2070, 2060 Super, 2060, 1080, and 2 1070s. Everything but my 1080 Ti (which needs it reduced because 1080 Tis are the spaz of the GPU world). All of the cards are fine and +500MHz seemed like a reasonable offset. But I have seen people discuss pushing the memory clock as high as it will possibly go stable, sometimes putting something like a +1000MHz or more memory clock offset on their cards. As I would like to reuse the cards for other purposes when they are done mining,
  8. Just playing Devil's Advocate here, but I would put it on them to commit the manpower to poring through the blockchain with my wallet address (when they found it... probably through IP connections or something) to confirm that I received X share on X date. That is probably a pretty solid roll of the dice on the part of the basement miner. They don't have enough resources to deal with people filing normal taxes. It seems silly to me that they would even attempt to collect on the income when there are so many barriers in the way to it being at all accurate. Anyway, f
  9. Okay. This is what I was after. Hypothetically for the sake of argument, outside of storing it in an exchange wallet, they could not track it back to you though, correct? Cold wallets can be lost. Hot wallets don't collect identifying information. I'm sure with the latter they could figure it out if they really, really tried, but that isn't happening for your average citizen. I'm not trying to skirt paying my taxes... only they and death... I am just curious as to the ins and outs of this whole thing being relatively new to it. There seem to be some gray areas here.
  10. But how would they know if you mined it on April 7, 2016 or April 7, 2021? Pretty big difference in what you would owe them. And while you might not be able to prove it, neither could they (I imagine without expending WAY more resources than they have available for your average home miner), and you're innocent until proven guilty.
  11. I'm not worried about me. They have far bigger fish to fry than me and my laundry room rig. But I do like to know the ins and outs of things that I am involved in and this just seems silly that the best tactic that they can employ is only fear of audit... which anyone with some sense should be able to get out of at this point, for the reasons I mentioned above. Uncle Sam always gets his in the end, but I see why crypto pisses off the government so much.
  12. Hm. This seems to be something of a shitshow. Some Googling came up with "Pursuant to IRS Notice 2014-21, when a taxpayer successfully “mines” virtual currency, the fair market value of the virtual currency as of the date of receipt is includible in gross income. This means that successfully mining cryptocurrency creates a taxable event and the value of the mined coins must be included in the taxpayer’s gross income at the time it is received." That isn't feasible in the least, lest you sit at the mining computer every second of the day and record the time/date/value
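      For what it's worth, the record-keeping that notice implies boils down to logging each payout with its fair market value at receipt. A minimal sketch, with entirely hypothetical payouts and prices (none of these figures are real, and this is not tax advice):

      ```python
      # Each mining payout is ordinary income at its fair market value when
      # received, per the notice quoted above. All data here is HYPOTHETICAL.
      from datetime import date

      # (date received, ETH amount, USD price per ETH at receipt)
      payouts = [
          (date(2021, 3, 1), 0.05, 1500.00),
          (date(2021, 3, 8), 0.05, 1800.00),
          (date(2021, 3, 15), 0.04, 1750.00),
      ]

      gross_income = sum(amount * price for _, amount, price in payouts)
      print(f"Reportable mining income: ${gross_income:.2f}")
      ```

      In practice most pools pay out on a schedule rather than continuously, so the log only needs one price lookup per payout, not one per second.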
  13. I've recently been mining ETH and have been storing it away, hoping for the price to rise again. I have also been reading about how the IRS plans to come down harder on crypto profits through capital gains taxes. What I don't understand is how can they know/verify how long you have held onto the coin if you store it in a non-exchange wallet? It would be the difference between short- and long-term capital gains rates. Would they only go by when it was transferred into/out of an exchange wallet and when it was sold/traded? What if you mined it and have held it for over a year in a Trezor on
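      The short- vs. long-term cutoff itself is trivial once you do have dates on record. A minimal sketch, using the common US more-than-one-year rule of thumb and hypothetical dates:

      ```python
      # Classify a sale as short- or long-term based on holding period.
      # Rule of thumb only: held MORE than one year -> long-term. Dates are
      # hypothetical examples, not records from any real wallet.
      from datetime import date

      def gains_category(acquired: date, sold: date) -> str:
          """Return 'long-term' if held more than 365 days, else 'short-term'."""
          return "long-term" if (sold - acquired).days > 365 else "short-term"

      print(gains_category(date(2016, 4, 7), date(2021, 4, 7)))  # long-term
      print(gains_category(date(2021, 1, 7), date(2021, 4, 7)))  # short-term
      ```

      The open question in the post still stands: the math is easy, proving the acquisition date to anyone's satisfaction is the hard part.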
  14. Passengers was an underrated sci-fi flick. I am only concerned because if/when the crypto market crashes, I will probably throw all of these back into the gaming computers that they were pulled from, which have now been turned into simple HTPCs. If I can sacrifice a few bits of coin for extending their lives, I would like to.
  15. 25 year computer hobbyist, first time miner. I dropped 7 of my GPUs into an Ethereum mining rig (1080Ti/1080/2070/2060S/2060/1070/1070). I have it in my laundry room (underground and 10 degrees Fahrenheit cooler than the rest of the house) exposed to open air, with a box fan blowing over the whole setup. Each card is basically overclocked to +500MHz on the memory, -200MHz on the core clock, and running at 75% power. With the box fan blowing over them, all but a tiny half-sized 2060 are running at around 50 degrees Celsius or less with the fans set at 45%. Without the box fan mos
  16. New to mining and going to set up a rig using all of the GPUs I have in the computers throughout my home. The rig will consist of:
      1080 Ti
      1080
      2070
      2060S
      2060
      2060
      Asrock H61 Pro BTC
      Pentium G2020T (35W TDP)
      2 x 4GB DDR3 1600
      I have three power supplies that I can use to power the rig. A 750W Gold, 650W Gold, and 620W Bronze. My plan as of now is to use the 750W to power the board, 1080Ti and 1080, then use the other two to power 2 of the remaining cards each (with the 650W gold powering the 2070 and 2060S and the 620W
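      For a back-of-envelope check on a PSU split like that, something like this works. The wattages below are rough stock board-power guesses, not measurements from this rig, and undervolted mining loads will sit well under them:

      ```python
      # Sanity-check each PSU rail against a rough load estimate.
      # All wattages are ASSUMED stock figures, not measured draws.
      rails = {
          "750W Gold":   [("board + CPU + RAM", 100), ("1080 Ti", 250), ("1080", 180)],
          "650W Gold":   [("2070", 175), ("2060 Super", 175)],
          "620W Bronze": [("2060", 160), ("2060", 160)],
      }

      for psu, loads in rails.items():
          capacity = int(psu.split("W")[0])  # parse "750W Gold" -> 750
          draw = sum(watts for _, watts in loads)
          print(f"{psu}: ~{draw}W of {capacity}W ({draw / capacity:.0%})")
      ```

      Staying somewhere around 70-80% of rated capacity per supply leaves headroom for spikes and keeps the units in their efficient range.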
  17. I'm just going to go with the Xeon, especially since the kid is 9 (10? I'm a terrible uncle...) years old and wouldn't be able to troubleshoot if anything hiccupped with an OC. With that said, I am now curious to blast the 3570k as high as it will go and see how it performs by comparison. People like to blast the earlier generation Intel Core series now (and rightfully so sometimes), but I still maintain that they punch above their weight given their age. I still love Ivy Bridge.
  18. Ah. Well good to know. Thanks. I will have to play around with that. So if it is stable and cool, would you take a 4.5GHz 3570K over a 3.7GHz Xeon? Would 800MHz outweigh hyperthreading nowadays?
  19. I was in the same boat with two 1155 systems. I went 3770K with one, got that up to 4.5GHz, and at 1080p 60Hz, matched with a 1080 Ti I am more than fine. A second option to look for is an 1155 Xeon. They used to be quite cheap by comparison ($40ish) for 4 cores/8 threads. You can't overclock them, but you get the extra threads (hyperthreading) out of them for far less money. The current used computer parts market has probably bumped them up in price, though. Still, it would be far cheaper than upgrading your entire system. Assuming that you are upgrading for games, it rea
  20. I was using a Cryorig H7 on the 3570K when overclocking it. Basically a Hyper 212 EVO, if not a couple degrees cooler. I suppose I shouldn't care at this point about the longevity of the chip, but I've always liked to use as little voltage as possible on my CPUs and that is a tough habit to break for an extra 300MHz.
  21. If I recall correctly, I have that disabled as I read that it can sometimes raise temperatures a bit. I could be mistaken, though.
  22. Thanks. This is what I needed. I figured the extra threads would pay off more than the speed, but wanted to double check. Now to decide how generous I am feeling and decide between giving him my extra 1070 or 1080...
  23. I can get the 3570K to 4.2GHz on a -0.02V offset, but have to pump it up to +1.0V to get to 4.5GHz, along with a 25% LLC. Temps are way too high for my liking at that point. 4.3/4.4GHz doesn't run on the negative offset, so I never bothered with it for 100/200MHz.
  24. I have an older system that I am going to give to my young nephew to play games on, but I have to decide between two CPUs. What is more important nowadays: core speed or total threads? I have to choose between a 3570K (4c/4t) at 4.2GHz, or a Xeon e3-1270V2 (3.7GHz all core, 3.9GHz 1 or 2 cores) with 4c/8t. I know that for the longest time 4 core speed was king, but has that changed? It is my understanding that RTS games especially like more cores/threads, but I am unsure about other genres; open world, first person shooters, Minecraft, etc. I know that there is no on
  25. This is why I was curious :). It seems like a good idea in theory but before I go deconstructing everything I don't want to be left in a situation where games are stuttering or lagging because it is now traveling across the network.