Search the Community
Showing results for tags 'asic'.
-
There are old ASIC miners that would be more profitable for me than new ones, but I am somewhat worried about malfunctions, the difficulty of managing old devices, and their remaining lifespan, so please advise me which way to go. For example: Antminer S19j Pro+ 122TH vs. Antminer S21 Hyd 355TH.
-
So, basically what the title says. The reason I want to do this is that I sneakily want to use their electricity for an ASIC miner that I have, but the only way to control it is by accessing its IP while on the same network (also Hue lights, and their Wi-Fi is terrible). The problem is that I don't think I can just use an Ethernet switch, because my school requires me to sign in with my credentials, and beyond that, there is no way for devices to connect to each other over their network anyway. So first of all, will this even fix my problem? I think it should, since I would have all of the devices on the same network, but I really have no idea. I know several people who have done this and did not get caught; I just really want to make sure that I do not. I saw a Reddit post that said to change the router's IP range to something different from the one the school uses so that you do not interfere with the school's network, and to name it something like a printer or PC. Is there anything else I should do to not get noticed? I was also going to put a VPN on the router, because I already use one 24/7 when connected to their network anyway. For reference, I have an Orbi router lying around, if that matters. Any help would be great, or if there is another way to get around this problem, that would be helpful too.
-
Hello guys, I saw something counter-intuitive in GPU-Z today, and I've only just noticed it... It says that lower ASIC quality is better for overclocking under water, LN2, or dry ice (not sure about that last one), but worse on air. Can anyone tell me whether that's a mistake, or if it's true, why is that so? I'm so, so confused right now... Thanks
- 3 replies
-
- overclocking
- asic
-
(and 1 more)
Tagged with:
-
Hello guys, I just thought of this... how can software read ASIC quality to that degree of accuracy? What process goes into measuring a GPU's ASIC quality, exactly? I wanted to know just out of curiosity. Thanks for any answers
-
Monero forked to Monero7 last night, rendering all ASIC miners useless. Yesterday the nethash was over 1 GH/s. Although pool nethash calculators have not updated and won't for a few days, the average block time went from 120 seconds to 821 seconds, which implies a nethash drop to around 165 MH/s. This is an insane drop, and it shows just how many miners were ASICs. The pool I mine on alone saw a 50% decrease in hashrate, while I saw no decrease in performance on my mining servers.

The result is that consumer hardware is significantly more profitable to mine on now, and the servers I built to mine Monero/CryptoNight currencies are going to get bigger payouts than they ever have. I expect to see a steady increase in Monero's value over the coming weeks/months. Getting rid of ASIC miners is crucial to a currency's long-term survival and for keeping Monero's privacy attributes safe. A few people holding the majority of the nethash is very bad, and now it's going to be a LOT more spread out.

Funny thing: "Monero Classic" is now a thing, an altcoin of Monero which ASICs can still mine. However, it's missing some key privacy features, so it's pretty much a fake coin, an attempt by ASIC manufacturers to keep selling their ASICs.
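The nethash estimate above can be reproduced with a bit of arithmetic: right after a fork, difficulty hasn't adjusted yet, so the network hashrate scales inversely with the observed block time. A minimal sketch using the rounded 1 GH/s pre-fork figure (the quoted 165 MH/s implies the exact pre-fork nethash was a bit above 1 GH/s):

```python
# Estimate post-fork network hashrate from the change in average block
# time. Difficulty lags right after a fork, so hashrate is inversely
# proportional to the observed block interval.
old_hashrate_mhs = 1000.0  # ~1 GH/s before the fork (rounded)
target_time_s = 120.0      # Monero's target block interval
observed_time_s = 821.0    # average block time seen after the fork

implied_hashrate_mhs = old_hashrate_mhs * target_time_s / observed_time_s
print(round(implied_hashrate_mhs))  # -> 146 (MH/s, with the rounded input)
```

Once the difficulty retargets back down, block times return to 120 seconds and this estimate no longer applies.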
- 1 reply
-
- monero
- cryptocurrency
-
(and 3 more)
Tagged with:
-
I was at HackArizona 2018 this past weekend, and I participated in Raytheon's IoT hacking competition. The competition involved hacking into an August smart lock, a Ring doorbell, a WRT54GL router with updated Tomato firmware, and a raspi C. To hack the router, we captured a packet with a handshake to the router and used aircrack-ng to run a dictionary attack with the rockyou wordlist to find the password. The attack took about 40 minutes to complete, and it got me thinking about the hardware limitations of the laptop we were using. If you don't know, when you capture a WPA/WPA2 handshake, you are essentially capturing the password itself, only hashed. When you run a dictionary attack like we did, you run passwords from a wordlist through the WPA/WPA2 key derivation and compare the hashed result with the handshake you captured to see if they match. If they do, you 0wn3d that machine, brah. So my question is, since I am not very familiar with cryptocurrency mining hardware: would you be able to use a USB ASIC miner such as the GekkoScience to perform dictionary attacks like this against hashed passwords? I don't see a reason why you couldn't... but I don't want to spend a bunch of money on hardware that I won't be able to use.
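The loop described above boils down to a few lines: WPA2-PSK derives its pairwise master key with PBKDF2-HMAC-SHA1 over the passphrase and SSID (4096 iterations, 32 bytes), and a dictionary attack just re-derives that key for every wordlist entry and compares. A minimal sketch of the principle; the SSID, passphrase, and wordlist here are made up for illustration:

```python
import hashlib

# WPA2-PSK pairwise master key: PBKDF2-HMAC-SHA1(passphrase, ssid, 4096, 32)
ssid = b"TargetAP"  # made-up network name
target_key = hashlib.pbkdf2_hmac("sha1", b"password123", ssid, 4096, 32)

def dictionary_attack(wordlist, ssid, target):
    """Re-derive the key for each candidate and compare to the target."""
    for candidate in wordlist:
        derived = hashlib.pbkdf2_hmac("sha1", candidate, ssid, 4096, 32)
        if derived == target:
            return candidate  # match found
    return None

print(dictionary_attack([b"letmein", b"password123"], ssid, target_key))
```

Those 4096 iterations per candidate are why the attack is compute-bound; note, though, that SHA-256 Bitcoin ASICs like the GekkoScience sticks are fixed-function (they hard-wire double SHA-256 over a block header), so as far as I know they can't be repurposed to run a PBKDF2-HMAC-SHA1 loop like this.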
-
Hope this is the right place to ask this question. Sorry for my English. I was wondering if it would be possible with an ASIC (I own a few Antminers) to change the OS from one ASIC to a different ASIC, for example the OS from an Antminer on an A4 Dominator. And also, is the algorithm set by the OS or by the chips? Could you use the OS from a Scrypt ASIC on an X11 ASIC?
- 2 replies
-
- asic
- operating system
-
(and 1 more)
Tagged with:
-
I haven't run across this yet and had assumed all factory voltages were the same for the RX 480. I am running a pair of Sapphire RX 480 8GB cards in CrossFire, and when I went to overclock them I noticed that the voltages were different between the two. I then checked each card's ASIC rating through GPU-Z (see attached pictures). It seems I lucked out with two pretty good chips so far. Edit: Maybe this is why some cards draw over their rated TDP?
-
In GPU-Z it says that my GTX 950 has an ASIC quality of 68.9%. Is this considered good?
-
W1zzard over at TechPowerUp is looking for volunteers with Fiji-based graphics cards (Fury, Fury X, Nano) to help him gather data for the GPU-Z tool. The original post is http://www.techpowerup.com/forums/threads/looking-for-volunteers-with-fiji-gpus-to-figure-out-asic-quality.220266/. Send W1zzard a private message over at techpowerup.com and ask for special instructions to help him test Fiji ASIC quality. It's just a simple test that takes a minute. With enough volunteers, maybe those of us with Fiji-based cards will get to see our ASIC quality in the not-too-distant future.
-
Man oh man. My MSI GTX 970 Gaming 4G has an ASIC quality of 59.8. Do I dare overclock this beast? It only gets up to 70C at standard clocks. It also has some pretty bad coil whine, but that is just the sound of the Formula One engine revving inside my PC.
-
Hi all. First, let me say that I am fairly new to GPU overclocking, and while I have been trying to read up on it, I still need some help with my particular situation. I am trying to test all my equipment before my grace period runs out, so I want to do some overclocking and benching while I can. I'll start with a rundown of my newly built system (and apologize for the long post, but some people seem to like a lot of detail):

CPU: i7 6700K
MB: MSI XPower Titanium OC Edition
GPU: 1x Gigabyte Xtreme GTX 980 Ti (air)
RAM: 16GB G.Skill Trident Z 3000MHz (CL15)
HD: 1) Samsung Evo 250GB SSD, 2) Intel 250GB SSD, 3) 2TB Samsung 7200 RPM
PSU: XFX Pro 750W Platinum
Case: Corsair Air 540 ATX
Monitor: 24" Asus 144Hz 1080p

Using MSI Afterburner, I tested my card out of the box and, to my surprise, it was running at a 1430MHz boost clock and 8009MHz memory (well above the advertised base clock of 1216 and boost of 1317). I confirmed this with GPU-Z, as I thought it must be a glitch, but after doing some research I found that cards running above their advertised speeds is fairly common, or at least not unheard of. The ASIC value reads 76.1. Of note, the main screen of GPU-Z shows the advertised stock speeds (1216/1317) in both the normal and OC sections, and only shows the higher 1430 in the sensors section. Also of note, the card runs as low as 200-400MHz during low-load periods (browsing the web), then immediately jumps to max boost during a demo like Fire Strike.

So, my first set of questions: 1) Is it normal for a card to run as low as 200-400MHz? Is this a throttling issue or some other malfunction? 2) If this is normal, what is even the point of the advertised "base clock" if the card never really runs at that speed, always either under or over it? 3) How can I test my card for throttling issues without buying a bunch of games (some people say it will throttle in certain games but not in benchmarks, and I only have CS:GO at the moment)? And 4) should I underclock the card for gaming that doesn't need much power, to extend its life and stability (and overclock later when I need it)? Again, sorry if these are really stupid and basic questions!

Given the ASIC, the out-of-box clock speeds, and how cool it runs, I immediately tried the following OC settings in Fire Strike: 1490, 1525, and 1551, using the stock voltage and offset and changing only the clock. By the way, the stock voltage is 1.18V. It ran perfectly until 1551, at which point it gave me a screen flicker every 10 seconds or so (like a green frame or two) but still ran fine for a minute or two (I stopped it early since it obviously was not stable). Next, I tried setting the clock to 1535 and still got some flickering, though notably less. I then increased the memory to +100 to see if that would help (which I think might actually be +400MHz, since it seems to read at a quarter of the effective speed) and it crashed instantly; backed down to +40, it actually seemed to reduce the frequency of the flickering (is this what people mean by artifacts?). I should also note that I saw the temps get into the 70s, so I doubled the fan profile, and I have not seen it go over 54C since. Is it normal for temps to reach the 70s at default clocks and fan speeds, or should I keep the custom fan profile running to keep it lower?

So my next set of questions: 1) Should I try increasing the voltage to get rid of the flickering, or just use the power target % feature? If I should raise the voltage, what voltage would be safe, given that I don't have a spare $660 to replace the card and don't want to gamble with its function or longevity? 2) What temperature range should I shoot for? 3) What is the best way to test its reliability as well as its OC ceiling? Also, I should mention this card has an LN2 mode and an option for an additional 6-pin input (though I don't have a spare connector on my PSU at the moment).

Lastly, I am curious where people think my card falls in the overall spectrum of OC ability. In forum posts I have seen ASIC scores ranging from the low 50s up to one person who claimed 90%, and clock speeds ranging from the mid-1300s to over 1600, with what appears to be a normal bell-curve distribution between those values, but no one really says what is average or how to know if you hit the "silicon lottery." I definitely would not press my luck by returning it just to roll for a better card, as I am very happy with it as it is (assuming no future issues are discovered), but I'm just curious what people think. Thank you, thank you, thank you in advance to anyone who takes the time to read and respond.
- 7 replies
-
- overclocking
- benchmarks
-
(and 8 more)
Tagged with:
-
How profitable is mining right now if I have access to free electricity? I am in college and pay rent with fixed bills, utilities included. So, is it profitable? If so, what coin should I mine, and what sort of miner speed would I need? Thanks in advance for the help.
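With free electricity, expected mining revenue is just your share of the network hashrate times the daily block rewards times the coin price. A rough sketch with made-up placeholder numbers; plug in current network stats and prices from a profitability calculator:

```python
# Back-of-the-envelope daily revenue estimate. All figures below are
# placeholders, not current numbers.
my_hashrate = 30e6         # your hashrate in H/s (assumed: 30 MH/s)
network_hashrate = 300e12  # total network hashrate in H/s (assumed)
block_reward = 2.0         # coins per block (assumed)
block_time_s = 13.0        # average seconds per block (assumed)
coin_price_usd = 200.0     # exchange rate (assumed)

blocks_per_day = 86400 / block_time_s
share = my_hashrate / network_hashrate
daily_usd = share * blocks_per_day * block_reward * coin_price_usd
print(f"${daily_usd:.2f}/day")  # with free power, revenue is profit
```

With no electricity cost to subtract, whatever the calculation yields is pure profit, which is why free power changes the break-even math so much.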
-
My 2 new cards (Asus Strix 980 Ti OC) have ASIC quality scores of 83% and 87%. How does this compare with what some of you guys have? I did some research, and it seems most people are in the 70-75 range, but it's hard to gauge whether that is for reference or non-reference cards. I noticed EVGA charges a lot more for 80+ ASIC quality on their Kingpin. Did I get lucky, or is this an expected range for non-reference cards? I had 2 non-factory-OC cards prior, with ASIC quality of 72 and 75.
-
Hey guys, I'm in a master's program for electrical engineering. As part of a class, I have to come up with project ideas for ASIC (application-specific integrated circuit) design. An ASIC can be any kind of integrated circuit that performs any task we would otherwise do with a controller or a complex circuit. Basically, anything you can think of that requires a circuit to perform an application, we could make into an ASIC. A popular example around this forum might be the ASICs used for mining cryptocurrency. I was hoping to get some inspiration/ideas for such a project. Any ideas you think could make a cool invention like this would be greatly appreciated. Thanks.
-
I've reinstalled GPU-Z many times; can this be real? And if it is, is it common with 770s? EDIT: It also boosts to 1300MHz by itself without going above 79C, even when I push the power to 110%.
-
http://www.kitguru.net/components/cpu/anton-shilov/intel-first-full-custom-xeon-cpus-are-due-next-year/ For a long time now, Intel has combined Altera FPGAs with customized flavors of its top SKUs for the likes of Microsoft and Amazon, to provide as much flexibility as possible for accelerating mission-critical algorithms. But in the financial sector, where microseconds and even nanoseconds matter for making lightning-fast trades based on the outcomes of those algorithms, IBM's POWER and Oracle's SPARC lend themselves better, and it shows in the pricing. Intel now wants to take away one of its opponents' key advantages: building chips from the ground up alongside target clients, effectively getting ASICs with minimal latency attached to their CPUs, while also saving on power and heat. With Intel going fully custom in the server market, we could see a shift in the balance of power between Intel, IBM, and Oracle that hasn't been felt since the original HPC war. Overall, IBM and Oracle have very little market share, but in the areas where they dominate, it's practically a 50/50 split between the two at very high margins. It'll be interesting to see if Intel can break into IBM's last stronghold after many years of leaving Big Blue with the financial sector. I wonder if Intel would be willing to collaborate with AMD and put Fiji/Greenland onto future Broadwell EP/EX and Skylake EP/EX. It seems like a good way to keep AMD alive while still taking most of the immediate financial benefit. Hell, maybe Intel will work with Nvidia and create CUDA-based APUs for the scientific computing arena, as old systems need to be upgraded to attract new customers. And as a far-fetched idea, what if this foreshadows Intel eventually going after AMD's console business 5-7 years down the road?
-
It seems that motherboard manufacturer Biostar has just announced ASIC-enabled motherboards. The boards going on sale will have ASIC chips made specifically for mining embedded on the board. The BTC-24GH, which contains 64 embedded ASIC chips, will output 24 GH/s of hashrate. This is the equivalent of 30 7970/280X GPUs at once! With the crash and now the recent slow rise of Bitcoin, only time will tell if this will be viable. With such performance, it seems very well done by Biostar, since they pack so much power into a smaller space. The only thing is that the price is not specified and may be high due to miner demand. Source: http://www.techpowerup.com/198967/biostar-ready-to-launch-motherboards-with-built-in-asics-great-for-bitcoins.html
-
http://thegenesisblock.com/bitcoin-network-reaches-1-petahash-per-second/ Definitely impressive. I haven't been following Bitcoin news for a long while; that increase since July is crazy. I guess the increase is from ASICs? I know they are custom-made specifically for mining, but could they really cause that big of an increase?
-
I recently bought an EVGA GTX 780 SC, coming over to the green team from a GHz 7970, and I am confused. BF3 crashes when I'm testing my clocks. I've had benchmarks run as high as 1200MHz according to what I set in Precision X, but all programs (GPU-Z/games) show 1110MHz. I don't understand Boost 2.0. Even now, at the settings posted below, in-game it shows 1110MHz; it seems stuck at this boost clock. ASIC: 72.8%. Core clock: 1041MHz. Boost clock: 1093MHz. Memory: 1515MHz. Custom fan curve, so it never goes above 75C. Using Precision X and OC Scanner, Nvidia beta driver. Please share your overclocking results and specify your card model.
-
What problems would there be in running virtual machines on ASIC chips? Wouldn't it almost eliminate the speed penalty of running a VM? Companies like AMD and Intel could license out designs for processors and still make money. It would also be great for the open-source community: design open-source hardware without paying the high costs of having the hardware manufactured. I would think the advantages outweigh the disadvantages, so why aren't we doing this already?
-
Sitting here trying to overclock my brand-new Sapphire Vapor-X 7950. So far I've reached 1150/1450 at stock voltage and it seems stable. Now I've discovered ASIC quality... What does ASIC quality mean? I've googled it, but no one seems to know exactly what it means or explain it well. My card has an ASIC quality of 71%, which I think seems low; most people with 7950s are getting around 80 from what I have seen. Thanks!